Feb 27 01:04:45 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 27 01:04:46 crc restorecon[4680]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 01:04:46 crc restorecon[4680]: 
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 01:04:46 crc restorecon[4680]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 27 01:04:47 crc kubenswrapper[4771]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 01:04:47 crc kubenswrapper[4771]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 27 01:04:47 crc kubenswrapper[4771]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 01:04:47 crc kubenswrapper[4771]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 27 01:04:47 crc kubenswrapper[4771]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 27 01:04:47 crc kubenswrapper[4771]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.503044 4771 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511264 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511308 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511321 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511332 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511342 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511350 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511359 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511367 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511379 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511392 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511401 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511410 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511421 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511432 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511442 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511453 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511462 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511473 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511482 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511491 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511499 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511506 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511523 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511531 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511540 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511586 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511594 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511602 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511610 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511621 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
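[Annotation] These "unrecognized feature gate" runs recur several times during startup, apparently once per pass that applies the merged gate set. The pattern in the log suggests the mechanism: gate names arriving from the OpenShift-level feature set are checked against the kubelet's own registry, unknown names are warned about and skipped rather than treated as fatal, and gates that are already GA or deprecated but still set explicitly draw the "will be removed" warnings. A simplified sketch of that dispatch, with a hypothetical registry (not the real k8s.io/component-base/featuregate code):

```go
package main

import "fmt"

// spec is a hypothetical stand-in for a feature gate's registry entry.
type spec struct {
	lockedGA   bool // GA gates: locked to default, slated for removal
	deprecated bool
}

// known is an illustrative registry; the kubelet's real one is far larger.
var known = map[string]spec{
	"CloudDualStackNodeIPs":                  {lockedGA: true},
	"DisableKubeletCloudCredentialProviders": {lockedGA: true},
	"ValidatingAdmissionPolicy":              {lockedGA: true},
	"KMSv1":                                  {deprecated: true},
	"NodeSwap":                               {},
}

func set(gates map[string]bool) {
	for name, enabled := range gates {
		s, ok := known[name]
		switch {
		case !ok:
			// OpenShift-only gates land here and are skipped.
			fmt.Printf("W] unrecognized feature gate: %s\n", name)
		case s.lockedGA:
			fmt.Printf("W] Setting GA feature gate %s=%t. It will be removed in a future release.\n", name, enabled)
		case s.deprecated:
			fmt.Printf("W] Setting deprecated feature gate %s=%t. It will be removed in a future release.\n", name, enabled)
		default:
			fmt.Printf("I] %s=%t\n", name, enabled)
		}
	}
}

func main() {
	set(map[string]bool{"GatewayAPI": true, "KMSv1": true, "NodeSwap": false})
}
```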
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511631 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511640 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511647 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511656 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511667 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511677 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511685 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511693 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511701 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511709 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511717 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511726 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511734 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511742 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511749 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511757 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511769 4771 feature_gate.go:330] unrecognized feature gate: Example Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511777 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511784 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511794 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511801 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511809 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511817 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511826 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511833 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511841 4771 feature_gate.go:330] 
unrecognized feature gate: GatewayAPI Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511849 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511856 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511864 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511871 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511878 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511891 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511900 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511911 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511919 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511931 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511940 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511948 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511956 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511964 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.511972 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512135 4771 flags.go:64] FLAG: --address="0.0.0.0" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512154 4771 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512171 4771 flags.go:64] FLAG: --anonymous-auth="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512183 4771 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512195 4771 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512204 4771 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512217 4771 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512228 4771 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512238 4771 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512248 4771 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512258 4771 flags.go:64] FLAG: 
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512270 4771 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512280 4771 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512289 4771 flags.go:64] FLAG: --cgroup-root="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512298 4771 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512307 4771 flags.go:64] FLAG: --client-ca-file="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512316 4771 flags.go:64] FLAG: --cloud-config="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512325 4771 flags.go:64] FLAG: --cloud-provider="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512334 4771 flags.go:64] FLAG: --cluster-dns="[]" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512348 4771 flags.go:64] FLAG: --cluster-domain="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512357 4771 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512366 4771 flags.go:64] FLAG: --config-dir="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512375 4771 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512385 4771 flags.go:64] FLAG: --container-log-max-files="5" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512397 4771 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512406 4771 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512415 4771 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512425 4771 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512434 4771 flags.go:64] FLAG: --contention-profiling="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512444 4771 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512453 4771 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512462 4771 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512471 4771 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512482 4771 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512492 4771 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512501 4771 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512510 4771 flags.go:64] FLAG: --enable-load-reader="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512520 4771 flags.go:64] FLAG: --enable-server="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512529 4771 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512539 4771 flags.go:64] FLAG: --event-burst="100" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512576 4771 flags.go:64] FLAG: --event-qps="50" 
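[Annotation] Each FLAG: line here is the kubelet logging its entire parsed flag set at verbosity 2 (the flags.go:64 call site), defaults included, which is why containerd-related values appear even though the runtime endpoint is CRI-O's socket. A rough sketch of the same pattern using the standard library flag package (the kubelet itself uses spf13/pflag, whose VisitAll behaves the same way):

```go
package main

import (
	"flag"
	"fmt"
)

func main() {
	// Two example flags standing in for the kubelet's much larger set.
	nodeIP := flag.String("node-ip", "", "IP address of the node")
	maxPods := flag.Int("max-pods", 110, "maximum pods per node")
	flag.Parse()
	_, _ = nodeIP, maxPods

	// VisitAll walks every defined flag, parsed or defaulted, so the dump
	// includes flags the operator never set on the command line.
	flag.VisitAll(func(f *flag.Flag) {
		fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
	})
}
```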
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512586 4771 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512595 4771 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512604 4771 flags.go:64] FLAG: --eviction-hard="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512615 4771 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512625 4771 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512634 4771 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512644 4771 flags.go:64] FLAG: --eviction-soft="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512653 4771 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512662 4771 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512671 4771 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512693 4771 flags.go:64] FLAG: --experimental-mounter-path="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512702 4771 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512712 4771 flags.go:64] FLAG: --fail-swap-on="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512720 4771 flags.go:64] FLAG: --feature-gates="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512732 4771 flags.go:64] FLAG: --file-check-frequency="20s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512741 4771 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512751 4771 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512762 4771 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512771 4771 flags.go:64] FLAG: --healthz-port="10248" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512780 4771 flags.go:64] FLAG: --help="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512789 4771 flags.go:64] FLAG: --hostname-override="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512798 4771 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512808 4771 flags.go:64] FLAG: --http-check-frequency="20s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512817 4771 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512838 4771 flags.go:64] FLAG: --image-credential-provider-config="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512847 4771 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512857 4771 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512868 4771 flags.go:64] FLAG: --image-service-endpoint="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512880 4771 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512891 4771 flags.go:64] FLAG: --kube-api-burst="100" Feb 27 01:04:47 crc 
kubenswrapper[4771]: I0227 01:04:47.512903 4771 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512915 4771 flags.go:64] FLAG: --kube-api-qps="50" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512926 4771 flags.go:64] FLAG: --kube-reserved="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512938 4771 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512948 4771 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512958 4771 flags.go:64] FLAG: --kubelet-cgroups="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512968 4771 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512977 4771 flags.go:64] FLAG: --lock-file="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512987 4771 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.512996 4771 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513006 4771 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513022 4771 flags.go:64] FLAG: --log-json-split-stream="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513032 4771 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513043 4771 flags.go:64] FLAG: --log-text-split-stream="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513052 4771 flags.go:64] FLAG: --logging-format="text" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513062 4771 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513072 4771 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513082 4771 flags.go:64] FLAG: --manifest-url="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513093 4771 flags.go:64] FLAG: --manifest-url-header="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513111 4771 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513122 4771 flags.go:64] FLAG: --max-open-files="1000000" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513144 4771 flags.go:64] FLAG: --max-pods="110" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513155 4771 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513167 4771 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513178 4771 flags.go:64] FLAG: --memory-manager-policy="None" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513187 4771 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513202 4771 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513211 4771 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513220 4771 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 
01:04:47.513243 4771 flags.go:64] FLAG: --node-status-max-images="50" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513252 4771 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513261 4771 flags.go:64] FLAG: --oom-score-adj="-999" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513270 4771 flags.go:64] FLAG: --pod-cidr="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513279 4771 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513296 4771 flags.go:64] FLAG: --pod-manifest-path="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513304 4771 flags.go:64] FLAG: --pod-max-pids="-1" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513314 4771 flags.go:64] FLAG: --pods-per-core="0" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513323 4771 flags.go:64] FLAG: --port="10250" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513332 4771 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513341 4771 flags.go:64] FLAG: --provider-id="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513350 4771 flags.go:64] FLAG: --qos-reserved="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513360 4771 flags.go:64] FLAG: --read-only-port="10255" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513369 4771 flags.go:64] FLAG: --register-node="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513377 4771 flags.go:64] FLAG: --register-schedulable="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513387 4771 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513402 4771 flags.go:64] FLAG: --registry-burst="10" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513411 4771 flags.go:64] FLAG: --registry-qps="5" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513420 4771 flags.go:64] FLAG: --reserved-cpus="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513430 4771 flags.go:64] FLAG: --reserved-memory="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513443 4771 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513452 4771 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513461 4771 flags.go:64] FLAG: --rotate-certificates="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513470 4771 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513480 4771 flags.go:64] FLAG: --runonce="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513489 4771 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513498 4771 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513507 4771 flags.go:64] FLAG: --seccomp-default="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513516 4771 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513528 4771 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 27 01:04:47 crc 
kubenswrapper[4771]: I0227 01:04:47.513538 4771 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513579 4771 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513589 4771 flags.go:64] FLAG: --storage-driver-password="root" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513598 4771 flags.go:64] FLAG: --storage-driver-secure="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513607 4771 flags.go:64] FLAG: --storage-driver-table="stats" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513616 4771 flags.go:64] FLAG: --storage-driver-user="root" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513625 4771 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513634 4771 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513644 4771 flags.go:64] FLAG: --system-cgroups="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513653 4771 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513667 4771 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513676 4771 flags.go:64] FLAG: --tls-cert-file="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513685 4771 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513698 4771 flags.go:64] FLAG: --tls-min-version="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513707 4771 flags.go:64] FLAG: --tls-private-key-file="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513715 4771 flags.go:64] FLAG: --topology-manager-policy="none" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513725 4771 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513734 4771 flags.go:64] FLAG: --topology-manager-scope="container" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513743 4771 flags.go:64] FLAG: --v="2" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513755 4771 flags.go:64] FLAG: --version="false" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513768 4771 flags.go:64] FLAG: --vmodule="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513778 4771 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.513788 4771 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514064 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514080 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514091 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514101 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514111 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514120 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 
01:04:47.514130 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514140 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514156 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514167 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514176 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514185 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514195 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514205 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514215 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514225 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514233 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514241 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514249 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514256 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514264 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514272 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514280 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514290 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514299 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514308 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514318 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514328 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514336 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514346 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514355 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514362 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514370 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514380 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514389 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514398 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514406 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514413 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514423 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514430 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514442 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514450 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514457 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514465 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514472 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514480 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514487 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514495 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514503 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514510 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514518 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514526 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 01:04:47 crc 
kubenswrapper[4771]: W0227 01:04:47.514533 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514541 4771 feature_gate.go:330] unrecognized feature gate: Example Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514578 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514586 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514594 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514602 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514610 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514618 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514626 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514634 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514642 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514649 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514657 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514669 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514677 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514687 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514697 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514705 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.514714 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.514738 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.528009 4771 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.528054 4771 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528184 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528197 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528206 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528216 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528224 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528235 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528248 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528257 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528266 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528275 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528286 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528295 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528304 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528312 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528321 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528329 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528337 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528345 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528352 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528361 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528369 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528377 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528384 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528393 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528403 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
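[Annotation] The "Golang settings" record a few entries back (GOGC="" GOMAXPROCS="" GOTRACEBACK="") shows that none of the Go runtime environment overrides are set for this kubelet process, so runtime defaults apply. A small sketch of reporting the same settings from a Go program:

```go
package main

import (
	"fmt"
	"os"
	"runtime"
)

func main() {
	// The same three overrides the kubelet reports at startup; empty means
	// the runtime default is in effect.
	fmt.Printf("GOGC=%q GOMAXPROCS=%q GOTRACEBACK=%q\n",
		os.Getenv("GOGC"), os.Getenv("GOMAXPROCS"), os.Getenv("GOTRACEBACK"))
	// Effective parallelism after any GOMAXPROCS override.
	fmt.Println("effective GOMAXPROCS:", runtime.GOMAXPROCS(0))
}
```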
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528412 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528420 4771 feature_gate.go:330] unrecognized feature gate: Example Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528428 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528436 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528444 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528452 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528459 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528467 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528475 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528485 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528494 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528521 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528530 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528538 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528568 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528577 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528585 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528593 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528600 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528608 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528615 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528623 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528631 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528638 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528646 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528654 4771 feature_gate.go:330] unrecognized feature gate: 
OnClusterBuild Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528662 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528669 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528677 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528685 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528693 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528701 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528709 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528717 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528724 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528734 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528744 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528754 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528762 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528771 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528779 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528788 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528796 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528804 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528812 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.528822 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.528837 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529071 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 01:04:47 
crc kubenswrapper[4771]: W0227 01:04:47.529084 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529094 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529103 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529112 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529120 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529129 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529137 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529145 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529153 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529161 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529172 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529180 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529188 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529195 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529207 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529218 4771 feature_gate.go:330] unrecognized feature gate: Example Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529227 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529235 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529243 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529252 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529262 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529271 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529281 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529289 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529298 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529306 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529314 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529322 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529330 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529337 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529346 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529353 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529361 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529370 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529377 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529386 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529394 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529401 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529409 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529416 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529425 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529434 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529446 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529458 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529468 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529481 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529492 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529501 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529509 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529517 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529525 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529534 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529542 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529574 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529583 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529592 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529600 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529608 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529616 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529624 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529632 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529640 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529647 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529655 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529662 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529670 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529679 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529687 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529695 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.529703 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.529716 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.531053 4771 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.535962 4771 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.540773 4771 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.540943 4771 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.544280 4771 server.go:997] "Starting client certificate rotation"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.544337 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.544652 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.572628 4771 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.575232 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.578270 4771 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.596848 4771 log.go:25] "Validated CRI v1 runtime API"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.637045 4771 log.go:25] "Validated CRI v1 image API"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.643109 4771 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.649643 4771 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-27-01-00-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.649693 4771 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.665222 4771 manager.go:217] Machine: {Timestamp:2026-02-27 01:04:47.660682898 +0000 UTC m=+0.598244206 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:375bc5bf-73cd-4494-8f02-c45b5f7dcf9a BootID:617895cc-625c-4c2b-869d-7397fcc31df7 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d8:5b:4b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d8:5b:4b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:56:b3:32 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:83:4e:31 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3a:3e:ce Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:df:50:38 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:76:b3:de:2d:71:c8 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e6:ef:8a:c2:e2:6a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.665414 4771 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.665510 4771 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.665812 4771 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.665948 4771 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.665974 4771 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.666169 4771 topology_manager.go:138] "Creating topology manager with none policy"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.666178 4771 container_manager_linux.go:303] "Creating device plugin manager"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.666709 4771 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.666737 4771 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.666873 4771 state_mem.go:36] "Initialized new in-memory state store"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.666944 4771 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.671360 4771 kubelet.go:418] "Attempting to sync node with API server"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.671385 4771 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.671415 4771 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.671427 4771 kubelet.go:324] "Adding apiserver pod source"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.671438 4771 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.675888 4771 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.677063 4771 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.677064 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.677162 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.677202 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.677313 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.679805 4771 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682758 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682785 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682794 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682803 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682817 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682849 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682858 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682872 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682882 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682891 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682917 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.682926 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.683876 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.684272 4771 server.go:1280] "Started kubelet"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.685339 4771 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.685509 4771 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 27 01:04:47 crc systemd[1]: Started Kubernetes Kubelet.
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.685669 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.686447 4771 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.687512 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.687693 4771 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.692845 4771 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.692881 4771 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.693385 4771 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.693503 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.693979 4771 factory.go:55] Registering systemd factory
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.694028 4771 factory.go:221] Registration of the systemd container factory successfully
Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.699216 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="200ms"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.699460 4771 server.go:460] "Adding debug handlers to kubelet server"
Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.699195 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.699719 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.699935 4771 factory.go:153] Registering CRI-O factory
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.701766 4771 factory.go:221] Registration of the crio container factory successfully
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.702118 4771 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.702391 4771 factory.go:103] Registering Raw factory
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.702694 4771 manager.go:1196] Started watching for new ooms in manager
Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.699447 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1897f4ef749adb2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.684246319 +0000 UTC m=+0.621807617,LastTimestamp:2026-02-27 01:04:47.684246319 +0000 UTC m=+0.621807617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.705817 4771 manager.go:319] Starting recovery of all containers
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711644 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711742 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711769 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711790 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711813 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711836 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711863 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711886 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711910 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711930 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711952 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711976 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.711999 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.712027 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.712051 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.712074 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.712102 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.712121 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.712140 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.712161 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.712184 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715127 4771 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715189 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715212 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715231 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715259 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715276 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715301 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715321 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715341 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715359 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715380 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715401 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715422 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715441 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715460 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715476 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715494 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715512 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715532 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715579 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715599 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715617 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715634 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715670 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715689 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715717 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715735 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715764 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715782 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715806 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715825 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715842 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715867 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715888 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715908 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715930 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715949 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715969 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.715999 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716016 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716035 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716054 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716071 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716092 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716111 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716129 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716149 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716168 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716189 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716208 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716225 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716243 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716263 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716281 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716300 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716318 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716337 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716354 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716372 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716390 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716409 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716428 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716446 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716462 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716481 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716499 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716516 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716533 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716574 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716593 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716611 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716629 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716646 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716664 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716682 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716702 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716721 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716740 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716759 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716777 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716796 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716814 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716832 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716851 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716877 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716897 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716920 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716943 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716962 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.716980 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717000 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717021 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717043 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717061 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717083 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717101 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717120 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717139 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717157 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717175 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717194 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717212 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717244 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717269 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717288 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717305 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717324 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717423 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717445 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717470 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717489 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717507 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717526 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717543 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717585 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717602 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717620 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717638 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a"
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717657 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717675 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717692 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717710 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717728 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717747 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717766 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717784 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717802 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717822 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717840 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717857 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717875 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717895 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717912 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717929 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717947 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717965 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717982 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.717998 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718017 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718035 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718054 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718111 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718131 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718149 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718167 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718184 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718201 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718219 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718237 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718253 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718271 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718288 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718306 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718323 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718341 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718358 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718376 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718393 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718413 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718430 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718448 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718466 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718484 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718503 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718521 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718631 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718651 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718669 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718685 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718702 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718721 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718738 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718757 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718775 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718796 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718813 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718830 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718848 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718867 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718883 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718902 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718919 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718937 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718956 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718973 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.718991 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.719011 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.719029 4771 reconstruct.go:97] "Volume reconstruction finished" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.719041 4771 reconciler.go:26] "Reconciler: start to sync state" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.742230 4771 manager.go:324] Recovery completed Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.759615 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.763694 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.763729 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.763740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.765080 4771 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.765097 4771 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.765117 4771 state_mem.go:36] "Initialized new in-memory state store" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.767563 4771 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.770596 4771 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.771443 4771 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.771904 4771 kubelet.go:2335] "Starting kubelet main sync loop" Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.772092 4771 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 27 01:04:47 crc kubenswrapper[4771]: W0227 01:04:47.773892 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.773987 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.785498 4771 policy_none.go:49] "None policy: Start" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.786484 4771 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.786513 4771 state_mem.go:35] "Initializing new in-memory state store" Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.794036 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.857808 4771 manager.go:334] "Starting Device Plugin manager" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.858196 4771 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.858217 4771 server.go:79] "Starting device plugin registration server" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.858742 4771 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.858767 4771 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.859057 4771 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.859173 4771 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.859186 4771 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.869056 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.873537 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 01:04:47 crc kubenswrapper[4771]: 
I0227 01:04:47.873650 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.875244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.875318 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.875340 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.875664 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.876452 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.876511 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.876991 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.877038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.877054 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.877174 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.877308 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.877347 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878001 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878024 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878036 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878297 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878273 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878406 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878691 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878767 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.878802 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.879873 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.879920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.879936 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.879991 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.880020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.880036 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.880232 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.880351 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.880408 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.881461 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.881502 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.881519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.881813 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.881869 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.882385 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.882435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.882456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.882877 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.882906 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.882922 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.900945 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="400ms" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921604 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921681 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921769 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921810 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921855 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.921985 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.922081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.922150 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.922219 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.922268 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.922300 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.963912 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.965516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.965610 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.965629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:47 crc kubenswrapper[4771]: I0227 01:04:47.965662 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:04:47 crc kubenswrapper[4771]: E0227 01:04:47.966164 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.024902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.024975 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025026 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025199 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025199 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025236 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025351 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025358 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025419 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025378 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025535 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025613 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025697 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025737 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025792 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.026069 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.025402 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.026153 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.026158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.026092 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.026221 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.026336 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.167200 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.168795 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.168845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.168863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.168896 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:04:48 crc kubenswrapper[4771]: E0227 01:04:48.169418 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.212212 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.219727 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.244574 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.255180 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.268194 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:48 crc kubenswrapper[4771]: W0227 01:04:48.269073 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e21f2d620ac5030cbc9da6b6a3e1a4424c48ce8841e0aa9ca88ce12f62b1c4a0 WatchSource:0}: Error finding container e21f2d620ac5030cbc9da6b6a3e1a4424c48ce8841e0aa9ca88ce12f62b1c4a0: Status 404 returned error can't find the container with id e21f2d620ac5030cbc9da6b6a3e1a4424c48ce8841e0aa9ca88ce12f62b1c4a0 Feb 27 01:04:48 crc kubenswrapper[4771]: W0227 01:04:48.269820 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-153f55ffc2975cb3450cc83e30d5bef754ef5912eed0700fdcf2eeda7b9ccda8 WatchSource:0}: Error finding container 153f55ffc2975cb3450cc83e30d5bef754ef5912eed0700fdcf2eeda7b9ccda8: Status 404 returned error can't find the container with id 153f55ffc2975cb3450cc83e30d5bef754ef5912eed0700fdcf2eeda7b9ccda8 Feb 27 01:04:48 crc kubenswrapper[4771]: W0227 01:04:48.281639 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c941e70e21116c3fcd65f0fcf52e0bc70211b6ac482ad84986031fc961df07dc WatchSource:0}: Error finding container c941e70e21116c3fcd65f0fcf52e0bc70211b6ac482ad84986031fc961df07dc: Status 404 returned error can't find the container with id c941e70e21116c3fcd65f0fcf52e0bc70211b6ac482ad84986031fc961df07dc Feb 27 01:04:48 crc kubenswrapper[4771]: W0227 01:04:48.288097 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2fb872f38ac5e1ddea833cb0133aec256e48365bc2d5c2bb01b0d40af26d49da WatchSource:0}: Error finding container 2fb872f38ac5e1ddea833cb0133aec256e48365bc2d5c2bb01b0d40af26d49da: Status 404 returned error can't find the container with id 2fb872f38ac5e1ddea833cb0133aec256e48365bc2d5c2bb01b0d40af26d49da Feb 27 01:04:48 crc kubenswrapper[4771]: W0227 01:04:48.299147 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0e293b562e22decc6028c3c7de4c950393c5374dbd739ce967e7583f3759e2d7 WatchSource:0}: Error finding container 0e293b562e22decc6028c3c7de4c950393c5374dbd739ce967e7583f3759e2d7: Status 404 returned error can't find the container with id 0e293b562e22decc6028c3c7de4c950393c5374dbd739ce967e7583f3759e2d7 Feb 27 01:04:48 crc kubenswrapper[4771]: E0227 01:04:48.301881 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="800ms" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.570011 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.571442 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.571492 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.571506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.571540 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:04:48 crc kubenswrapper[4771]: E0227 01:04:48.572157 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc" Feb 27 01:04:48 crc kubenswrapper[4771]: W0227 01:04:48.603312 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Feb 27 01:04:48 crc kubenswrapper[4771]: E0227 01:04:48.603405 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError" Feb 27 01:04:48 crc kubenswrapper[4771]: W0227 01:04:48.640379 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Feb 27 01:04:48 crc kubenswrapper[4771]: E0227 01:04:48.640473 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError" Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.686733 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.778330 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"153f55ffc2975cb3450cc83e30d5bef754ef5912eed0700fdcf2eeda7b9ccda8"} Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.779717 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e293b562e22decc6028c3c7de4c950393c5374dbd739ce967e7583f3759e2d7"} Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.781820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2fb872f38ac5e1ddea833cb0133aec256e48365bc2d5c2bb01b0d40af26d49da"} Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.783324 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c941e70e21116c3fcd65f0fcf52e0bc70211b6ac482ad84986031fc961df07dc"} Feb 27 01:04:48 crc kubenswrapper[4771]: I0227 01:04:48.785127 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e21f2d620ac5030cbc9da6b6a3e1a4424c48ce8841e0aa9ca88ce12f62b1c4a0"} Feb 27 01:04:48 crc kubenswrapper[4771]: W0227 01:04:48.936262 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Feb 27 01:04:48 crc kubenswrapper[4771]: E0227 01:04:48.936361 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError" Feb 27 01:04:49 crc kubenswrapper[4771]: E0227 01:04:49.102914 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="1.6s" Feb 27 01:04:49 crc kubenswrapper[4771]: W0227 01:04:49.190457 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Feb 27 01:04:49 crc kubenswrapper[4771]: E0227 01:04:49.190626 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.373079 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.374829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.374886 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.374904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.374979 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:04:49 crc kubenswrapper[4771]: E0227 01:04:49.375601 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.677960 4771 
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.677960 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 27 01:04:49 crc kubenswrapper[4771]: E0227 01:04:49.679633 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.686719 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.789898 4771 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56" exitCode=0
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.789951 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56"}
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.790038 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.791273 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.791329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.791349 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.791930 4771 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724" exitCode=0
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.791966 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724"}
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.792029 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.792757 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.792775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.792783 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.795540 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5"} Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.795609 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6"} Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.795633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf"} Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.796936 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b" exitCode=0 Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.796987 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.797015 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b"} Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.801056 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.801144 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.801211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.802843 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8" exitCode=0 Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.802882 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8"} Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.802985 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.803891 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.803965 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.804018 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.806593 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:49 crc 
kubenswrapper[4771]: I0227 01:04:49.807441 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.807763 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:49 crc kubenswrapper[4771]: I0227 01:04:49.807868 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.686510 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Feb 27 01:04:50 crc kubenswrapper[4771]: E0227 01:04:50.706607 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="3.2s" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.809388 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e86f48f09a5e07a4685c65f576a1110899727e8aed66e70b9cc68ef5ad582c43"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.809443 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.810262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.810293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.810305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.811394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.811434 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.811453 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.811466 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.812372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.812404 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.812414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.814709 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.814734 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.815510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.815575 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.815593 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.817955 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.817989 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.818005 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.818021 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.821353 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0" exitCode=0 Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.821394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0"} Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.821484 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.822409 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.822443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.822458 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:50 crc kubenswrapper[4771]: W0227 01:04:50.864477 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.189:6443: connect: connection refused Feb 27 01:04:50 crc kubenswrapper[4771]: E0227 01:04:50.865224 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.189:6443: connect: connection refused" logger="UnhandledError" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.976159 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.978129 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.978184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.978196 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:50 crc kubenswrapper[4771]: I0227 01:04:50.978238 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:04:50 crc kubenswrapper[4771]: E0227 01:04:50.980820 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.189:6443: connect: connection refused" node="crc" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.711840 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.830309 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2f1ccc1ef920495ecbb8a758649b4bc7c55d73d2c142da5c41632d7da93b303b"} Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.830453 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.831977 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.832164 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.832304 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.833983 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7" exitCode=0 Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.834111 4771 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.834154 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.834412 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7"} Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.834635 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.834718 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.835842 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.835893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.835913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.836020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.836058 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.836077 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.836905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.836931 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.836966 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.836999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.836971 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:51 crc kubenswrapper[4771]: I0227 01:04:51.837082 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.762438 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.842718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978"} Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.842786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910"} Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.842805 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.842815 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1"} Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.842938 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.843797 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.843838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.843853 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.844580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.844622 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:52 crc kubenswrapper[4771]: I0227 01:04:52.844643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.693836 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.788379 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.788675 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.790300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.790358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.790380 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.851568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d"} Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.851642 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.851658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57"} Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.851670 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.853131 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.853192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.853209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.853212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.853257 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:53 crc kubenswrapper[4771]: I0227 01:04:53.853276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.181346 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.183089 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.183145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.183162 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.183197 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.641534 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.701708 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.702013 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.703858 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.703923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.703943 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.854127 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.854291 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.855376 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.855428 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.855444 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.856088 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.856168 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:54 crc kubenswrapper[4771]: I0227 01:04:54.856188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.280771 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.280978 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.282718 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.282770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.282788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.289902 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.698254 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.698497 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.700270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.700341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.700360 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.789288 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.789388 4771 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.860096 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.860983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.861053 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:56 crc kubenswrapper[4771]: I0227 01:04:56.861077 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.680510 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.824141 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.824358 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.826109 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.826181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.826218 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.862587 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.863796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.863841 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:04:57 crc kubenswrapper[4771]: I0227 01:04:57.863859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:04:57 crc kubenswrapper[4771]: E0227 01:04:57.869438 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.687032 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 27 01:05:01 crc kubenswrapper[4771]: W0227 01:05:01.697715 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.697842 4771 
trace.go:236] Trace[209285551]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 01:04:51.696) (total time: 10001ms): Feb 27 01:05:01 crc kubenswrapper[4771]: Trace[209285551]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (01:05:01.697) Feb 27 01:05:01 crc kubenswrapper[4771]: Trace[209285551]: [10.001651241s] [10.001651241s] END Feb 27 01:05:01 crc kubenswrapper[4771]: E0227 01:05:01.697876 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.876273 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.878928 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2f1ccc1ef920495ecbb8a758649b4bc7c55d73d2c142da5c41632d7da93b303b" exitCode=255 Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.878996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2f1ccc1ef920495ecbb8a758649b4bc7c55d73d2c142da5c41632d7da93b303b"} Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.879199 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.887157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.887230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.887253 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.888219 4771 scope.go:117] "RemoveContainer" containerID="2f1ccc1ef920495ecbb8a758649b4bc7c55d73d2c142da5c41632d7da93b303b" Feb 27 01:05:01 crc kubenswrapper[4771]: W0227 01:05:01.979323 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 01:05:01 crc kubenswrapper[4771]: I0227 01:05:01.979460 4771 trace.go:236] Trace[2097946000]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 01:04:51.977) (total time: 10002ms): Feb 27 01:05:01 crc kubenswrapper[4771]: Trace[2097946000]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (01:05:01.979) Feb 27 01:05:01 crc kubenswrapper[4771]: Trace[2097946000]: [10.002023111s] [10.002023111s] END Feb 27 01:05:01 crc kubenswrapper[4771]: E0227 01:05:01.979494 4771 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 27 01:05:02 crc kubenswrapper[4771]: W0227 01:05:02.184857 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.185000 4771 trace.go:236] Trace[724443974]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 01:04:52.183) (total time: 10001ms): Feb 27 01:05:02 crc kubenswrapper[4771]: Trace[724443974]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (01:05:02.184) Feb 27 01:05:02 crc kubenswrapper[4771]: Trace[724443974]: [10.001785525s] [10.001785525s] END Feb 27 01:05:02 crc kubenswrapper[4771]: E0227 01:05:02.185040 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 27 01:05:02 crc kubenswrapper[4771]: E0227 01:05:02.214579 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:02Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897f4ef749adb2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.684246319 +0000 UTC m=+0.621807617,LastTimestamp:2026-02-27 01:04:47.684246319 +0000 UTC m=+0.621807617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:02 crc kubenswrapper[4771]: W0227 01:05:02.216736 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:02Z is after 2026-02-23T05:33:13Z Feb 27 01:05:02 crc kubenswrapper[4771]: E0227 01:05:02.216841 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:02 crc kubenswrapper[4771]: E0227 01:05:02.218924 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:02Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 27 01:05:02 crc kubenswrapper[4771]: E0227 01:05:02.220963 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:02Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.221163 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.221228 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.226280 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.226358 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 27 01:05:02 crc kubenswrapper[4771]: E0227 01:05:02.227914 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.690896 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:02Z is after 2026-02-23T05:33:13Z Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.883863 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.884522 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.886522 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bafc9d2fe5c47cec0e8c1cfc64ef820d14ecd0c46519ef4da13e77bdbbf3c564" exitCode=255 Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.886596 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bafc9d2fe5c47cec0e8c1cfc64ef820d14ecd0c46519ef4da13e77bdbbf3c564"} Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.886669 4771 scope.go:117] "RemoveContainer" containerID="2f1ccc1ef920495ecbb8a758649b4bc7c55d73d2c142da5c41632d7da93b303b" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.886876 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.888876 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.888931 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.888950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:02 crc kubenswrapper[4771]: I0227 01:05:02.889730 4771 scope.go:117] "RemoveContainer" containerID="bafc9d2fe5c47cec0e8c1cfc64ef820d14ecd0c46519ef4da13e77bdbbf3c564" Feb 27 01:05:02 crc kubenswrapper[4771]: E0227 01:05:02.890020 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.014907 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.015208 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.017037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.017099 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.017113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.064266 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.692033 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-27T01:05:03Z is after 2026-02-23T05:33:13Z Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.892395 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.895717 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.897587 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.897648 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.897668 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:03 crc kubenswrapper[4771]: I0227 01:05:03.916163 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.690133 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:04Z is after 2026-02-23T05:33:13Z Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.706907 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.707072 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.709003 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.709042 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.709057 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.899851 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.901242 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.901317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:04 crc kubenswrapper[4771]: I0227 01:05:04.901341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:05 crc kubenswrapper[4771]: I0227 01:05:05.039253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:05:05 crc kubenswrapper[4771]: I0227 01:05:05.040046 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:05 crc kubenswrapper[4771]: I0227 01:05:05.042087 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:05 crc kubenswrapper[4771]: I0227 01:05:05.042271 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:05 crc kubenswrapper[4771]: I0227 01:05:05.042348 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:05 crc kubenswrapper[4771]: I0227 01:05:05.043132 4771 scope.go:117] "RemoveContainer" containerID="bafc9d2fe5c47cec0e8c1cfc64ef820d14ecd0c46519ef4da13e77bdbbf3c564" Feb 27 01:05:05 crc kubenswrapper[4771]: E0227 01:05:05.043418 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:05 crc kubenswrapper[4771]: I0227 01:05:05.691398 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:05Z is after 2026-02-23T05:33:13Z Feb 27 01:05:05 crc kubenswrapper[4771]: W0227 01:05:05.958473 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:05Z is after 2026-02-23T05:33:13Z Feb 27 01:05:05 crc kubenswrapper[4771]: E0227 01:05:05.958648 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:06 crc kubenswrapper[4771]: W0227 01:05:06.086694 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:06Z is after 2026-02-23T05:33:13Z Feb 27 01:05:06 crc kubenswrapper[4771]: E0227 01:05:06.086828 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.691931 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:06Z is after 2026-02-23T05:33:13Z Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.706737 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.706979 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.708794 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.708865 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.708886 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.709786 4771 scope.go:117] "RemoveContainer" containerID="bafc9d2fe5c47cec0e8c1cfc64ef820d14ecd0c46519ef4da13e77bdbbf3c564" Feb 27 01:05:06 crc kubenswrapper[4771]: E0227 01:05:06.710085 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.714270 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.789021 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.789223 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.906168 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.907819 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.907889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.907916 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:06 crc kubenswrapper[4771]: I0227 01:05:06.908924 
4771 scope.go:117] "RemoveContainer" containerID="bafc9d2fe5c47cec0e8c1cfc64ef820d14ecd0c46519ef4da13e77bdbbf3c564" Feb 27 01:05:06 crc kubenswrapper[4771]: E0227 01:05:06.909248 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:07 crc kubenswrapper[4771]: W0227 01:05:07.189514 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:07Z is after 2026-02-23T05:33:13Z Feb 27 01:05:07 crc kubenswrapper[4771]: E0227 01:05:07.189609 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:07 crc kubenswrapper[4771]: I0227 01:05:07.690771 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:07Z is after 2026-02-23T05:33:13Z Feb 27 01:05:07 crc kubenswrapper[4771]: E0227 01:05:07.869946 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:05:08 crc kubenswrapper[4771]: I0227 01:05:08.621380 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:08 crc kubenswrapper[4771]: I0227 01:05:08.623130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:08 crc kubenswrapper[4771]: I0227 01:05:08.623178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:08 crc kubenswrapper[4771]: I0227 01:05:08.623198 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:08 crc kubenswrapper[4771]: I0227 01:05:08.623235 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:05:08 crc kubenswrapper[4771]: E0227 01:05:08.626190 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:08Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 01:05:08 crc kubenswrapper[4771]: E0227 01:05:08.629579 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:08Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 01:05:08 crc kubenswrapper[4771]: I0227 01:05:08.690986 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:08Z is after 2026-02-23T05:33:13Z Feb 27 01:05:09 crc kubenswrapper[4771]: I0227 01:05:09.690050 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:09Z is after 2026-02-23T05:33:13Z Feb 27 01:05:10 crc kubenswrapper[4771]: W0227 01:05:10.227463 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:10Z is after 2026-02-23T05:33:13Z Feb 27 01:05:10 crc kubenswrapper[4771]: E0227 01:05:10.227631 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:10 crc kubenswrapper[4771]: I0227 01:05:10.691533 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:10Z is after 2026-02-23T05:33:13Z Feb 27 01:05:10 crc kubenswrapper[4771]: I0227 01:05:10.764239 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 01:05:10 crc kubenswrapper[4771]: E0227 01:05:10.771030 4771 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:11 crc kubenswrapper[4771]: I0227 01:05:11.691175 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:11Z is after 2026-02-23T05:33:13Z Feb 27 01:05:12 crc kubenswrapper[4771]: E0227 01:05:12.218366 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:12Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897f4ef749adb2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.684246319 +0000 UTC m=+0.621807617,LastTimestamp:2026-02-27 01:04:47.684246319 +0000 UTC m=+0.621807617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:12 crc kubenswrapper[4771]: I0227 01:05:12.691163 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:12Z is after 2026-02-23T05:33:13Z Feb 27 01:05:12 crc kubenswrapper[4771]: I0227 01:05:12.762751 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:05:12 crc kubenswrapper[4771]: I0227 01:05:12.763231 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:12 crc kubenswrapper[4771]: I0227 01:05:12.764978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:12 crc kubenswrapper[4771]: I0227 01:05:12.765187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:12 crc kubenswrapper[4771]: I0227 01:05:12.765327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:12 crc kubenswrapper[4771]: I0227 01:05:12.766341 4771 scope.go:117] "RemoveContainer" containerID="bafc9d2fe5c47cec0e8c1cfc64ef820d14ecd0c46519ef4da13e77bdbbf3c564" Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.692230 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:13Z is after 2026-02-23T05:33:13Z Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.928746 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.930513 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.933596 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bcecf191573ec1e56a589b3e69a074799b012a802c1a3e8cacc7dd3dddd65e9e" exitCode=255 Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.933656 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bcecf191573ec1e56a589b3e69a074799b012a802c1a3e8cacc7dd3dddd65e9e"} Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.933706 4771 scope.go:117] "RemoveContainer" containerID="bafc9d2fe5c47cec0e8c1cfc64ef820d14ecd0c46519ef4da13e77bdbbf3c564" Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.933871 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.935038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.935073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.935084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:13 crc kubenswrapper[4771]: I0227 01:05:13.935664 4771 scope.go:117] "RemoveContainer" containerID="bcecf191573ec1e56a589b3e69a074799b012a802c1a3e8cacc7dd3dddd65e9e" Feb 27 01:05:13 crc kubenswrapper[4771]: E0227 01:05:13.935868 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:14 crc kubenswrapper[4771]: I0227 01:05:14.691366 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:14Z is after 2026-02-23T05:33:13Z Feb 27 01:05:14 crc kubenswrapper[4771]: I0227 01:05:14.939822 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.038585 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.038793 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.040235 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.040287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.040306 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.041158 4771 scope.go:117] "RemoveContainer" containerID="bcecf191573ec1e56a589b3e69a074799b012a802c1a3e8cacc7dd3dddd65e9e" Feb 27 01:05:15 crc kubenswrapper[4771]: E0227 01:05:15.041440 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.629964 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.632252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.632310 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.632328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.632361 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:05:15 crc kubenswrapper[4771]: E0227 01:05:15.633921 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:15Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 01:05:15 crc kubenswrapper[4771]: E0227 01:05:15.638311 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:15Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 01:05:15 crc kubenswrapper[4771]: I0227 01:05:15.691363 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:15Z is after 2026-02-23T05:33:13Z Feb 27 01:05:16 crc kubenswrapper[4771]: W0227 01:05:16.005496 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z Feb 27 01:05:16 crc kubenswrapper[4771]: E0227 01:05:16.005637 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.692521 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.789150 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.789243 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.789321 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.789546 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.791754 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.791811 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.791829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.792485 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.792773 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6" gracePeriod=30 Feb 27 01:05:16 crc kubenswrapper[4771]: W0227 01:05:16.846686 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z Feb 27 01:05:16 crc kubenswrapper[4771]: E0227 01:05:16.846786 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.953598 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.954093 4771 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6" exitCode=255 Feb 27 01:05:16 crc kubenswrapper[4771]: I0227 01:05:16.954158 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6"} Feb 27 01:05:17 crc kubenswrapper[4771]: I0227 01:05:17.691141 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:17Z is after 2026-02-23T05:33:13Z Feb 27 01:05:17 crc kubenswrapper[4771]: E0227 01:05:17.870071 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:05:17 crc kubenswrapper[4771]: I0227 01:05:17.959026 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 01:05:17 crc kubenswrapper[4771]: I0227 01:05:17.959513 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b"} Feb 27 01:05:17 crc kubenswrapper[4771]: I0227 01:05:17.959622 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:17 crc kubenswrapper[4771]: I0227 01:05:17.960436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:17 crc kubenswrapper[4771]: I0227 01:05:17.960464 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:17 crc kubenswrapper[4771]: I0227 01:05:17.960473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:18 crc kubenswrapper[4771]: W0227 01:05:18.214139 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:18Z is after 2026-02-23T05:33:13Z Feb 27 01:05:18 crc kubenswrapper[4771]: E0227 01:05:18.214306 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-02-27T01:05:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 01:05:18 crc kubenswrapper[4771]: I0227 01:05:18.692246 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:18Z is after 2026-02-23T05:33:13Z Feb 27 01:05:18 crc kubenswrapper[4771]: I0227 01:05:18.962404 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:18 crc kubenswrapper[4771]: I0227 01:05:18.963824 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:18 crc kubenswrapper[4771]: I0227 01:05:18.963876 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:18 crc kubenswrapper[4771]: I0227 01:05:18.963893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:19 crc kubenswrapper[4771]: I0227 01:05:19.691314 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:19Z is after 2026-02-23T05:33:13Z Feb 27 01:05:20 crc kubenswrapper[4771]: I0227 01:05:20.693102 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:21 crc kubenswrapper[4771]: I0227 01:05:21.697048 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.226149 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef749adb2f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.684246319 +0000 UTC m=+0.621807617,LastTimestamp:2026-02-27 01:04:47.684246319 +0000 UTC m=+0.621807617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.233705 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef79578a24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,LastTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.240792 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957c7a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC m=+0.701297788,LastTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC m=+0.701297788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.248250 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957f1a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.763747241 +0000 UTC m=+0.701308539,LastTimestamp:2026-02-27 01:04:47.763747241 +0000 UTC m=+0.701308539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.255156 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7f34e44f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.862113359 +0000 UTC m=+0.799674647,LastTimestamp:2026-02-27 01:04:47.862113359 +0000 UTC m=+0.799674647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.262208 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef79578a24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef79578a24 default 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,LastTimestamp:2026-02-27 01:04:47.875270072 +0000 UTC m=+0.812831370,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.269246 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957c7a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957c7a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC m=+0.701297788,LastTimestamp:2026-02-27 01:04:47.875330323 +0000 UTC m=+0.812891621,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.276878 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957f1a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957f1a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.763747241 +0000 UTC m=+0.701308539,LastTimestamp:2026-02-27 01:04:47.875348304 +0000 UTC m=+0.812909602,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.283895 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef79578a24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef79578a24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,LastTimestamp:2026-02-27 01:04:47.877017808 +0000 UTC m=+0.814579106,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.290405 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957c7a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957c7a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC m=+0.701297788,LastTimestamp:2026-02-27 01:04:47.877048329 +0000 UTC m=+0.814609627,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.300673 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957f1a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957f1a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.763747241 +0000 UTC m=+0.701308539,LastTimestamp:2026-02-27 01:04:47.87706268 +0000 UTC m=+0.814623978,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.307318 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef79578a24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef79578a24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,LastTimestamp:2026-02-27 01:04:47.878017675 +0000 UTC m=+0.815578973,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.314132 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957c7a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957c7a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC m=+0.701297788,LastTimestamp:2026-02-27 01:04:47.878030556 +0000 UTC m=+0.815591854,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.320490 4771 event.go:359] "Server rejected event (will not retry!)" 
err="events \"crc.1897f4ef7957f1a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957f1a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.763747241 +0000 UTC m=+0.701308539,LastTimestamp:2026-02-27 01:04:47.878042076 +0000 UTC m=+0.815603374,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.327215 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef79578a24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef79578a24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,LastTimestamp:2026-02-27 01:04:47.878239281 +0000 UTC m=+0.815800609,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.334344 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef79578a24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef79578a24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,LastTimestamp:2026-02-27 01:04:47.878285092 +0000 UTC m=+0.815846390,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.339576 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957c7a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957c7a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC m=+0.701297788,LastTimestamp:2026-02-27 01:04:47.878306943 +0000 UTC m=+0.815868251,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 
01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.346135 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957f1a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957f1a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.763747241 +0000 UTC m=+0.701308539,LastTimestamp:2026-02-27 01:04:47.878320173 +0000 UTC m=+0.815881471,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.352858 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957c7a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957c7a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC m=+0.701297788,LastTimestamp:2026-02-27 01:04:47.878385695 +0000 UTC m=+0.815947023,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.359715 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957f1a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957f1a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.763747241 +0000 UTC m=+0.701308539,LastTimestamp:2026-02-27 01:04:47.878421696 +0000 UTC m=+0.815983024,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.366812 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef79578a24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef79578a24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,LastTimestamp:2026-02-27 01:04:47.879910276 +0000 UTC m=+0.817471584,Count:7,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.374394 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957c7a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957c7a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC m=+0.701297788,LastTimestamp:2026-02-27 01:04:47.879931156 +0000 UTC m=+0.817492454,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.380924 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957f1a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957f1a9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.763747241 +0000 UTC m=+0.701308539,LastTimestamp:2026-02-27 01:04:47.879944507 +0000 UTC m=+0.817505805,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.388350 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef79578a24\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef79578a24 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76372074 +0000 UTC m=+0.701282038,LastTimestamp:2026-02-27 01:04:47.880010978 +0000 UTC m=+0.817572306,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.395087 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897f4ef7957c7a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897f4ef7957c7a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:47.76373648 +0000 UTC 
m=+0.701297788,LastTimestamp:2026-02-27 01:04:47.880030319 +0000 UTC m=+0.817591647,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.407596 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4ef97c31a43 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.274086467 +0000 UTC m=+1.211647785,LastTimestamp:2026-02-27 01:04:48.274086467 +0000 UTC m=+1.211647785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.413104 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4ef97c3af36 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.274124598 +0000 UTC m=+1.211685926,LastTimestamp:2026-02-27 01:04:48.274124598 +0000 UTC m=+1.211685926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.419935 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f4ef987393d7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.285651927 +0000 UTC 
m=+1.223213245,LastTimestamp:2026-02-27 01:04:48.285651927 +0000 UTC m=+1.223213245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.426631 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4ef99307adf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.298031839 +0000 UTC m=+1.235593167,LastTimestamp:2026-02-27 01:04:48.298031839 +0000 UTC m=+1.235593167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.431768 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4ef99aca171 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.306168177 +0000 UTC m=+1.243729495,LastTimestamp:2026-02-27 01:04:48.306168177 +0000 UTC m=+1.243729495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.437288 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4efbf415c3a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.936672314 +0000 UTC m=+1.874233602,LastTimestamp:2026-02-27 01:04:48.936672314 +0000 UTC m=+1.874233602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.444000 4771 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4efbf4b0b47 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.937306951 +0000 UTC m=+1.874868259,LastTimestamp:2026-02-27 01:04:48.937306951 +0000 UTC m=+1.874868259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.448934 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4efbf4b496d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.937322861 +0000 UTC m=+1.874884149,LastTimestamp:2026-02-27 01:04:48.937322861 +0000 UTC m=+1.874884149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.455325 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efbf53b86f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.937875567 +0000 UTC m=+1.875436855,LastTimestamp:2026-02-27 01:04:48.937875567 +0000 UTC m=+1.875436855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.462138 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f4efbf577cd5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.938122453 +0000 UTC m=+1.875683741,LastTimestamp:2026-02-27 01:04:48.938122453 +0000 UTC m=+1.875683741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.469775 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4efbfd5d587 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.946402695 +0000 UTC m=+1.883964013,LastTimestamp:2026-02-27 01:04:48.946402695 +0000 UTC m=+1.883964013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.476658 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efc00abd0e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.949869838 +0000 UTC m=+1.887431126,LastTimestamp:2026-02-27 01:04:48.949869838 +0000 UTC m=+1.887431126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.483224 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efc018b661 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.950785633 +0000 UTC m=+1.888346921,LastTimestamp:2026-02-27 01:04:48.950785633 +0000 UTC m=+1.888346921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.489960 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4efc034bcfc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.952622332 +0000 UTC m=+1.890183620,LastTimestamp:2026-02-27 01:04:48.952622332 +0000 UTC m=+1.890183620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.496980 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4efc06aceeb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.956165867 +0000 UTC m=+1.893727165,LastTimestamp:2026-02-27 01:04:48.956165867 +0000 UTC m=+1.893727165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.504141 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f4efc095a969 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.958974313 +0000 UTC m=+1.896535601,LastTimestamp:2026-02-27 01:04:48.958974313 +0000 UTC m=+1.896535601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.513524 4771 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efd48a6aed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.293781741 +0000 UTC m=+2.231343059,LastTimestamp:2026-02-27 01:04:49.293781741 +0000 UTC m=+2.231343059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.520465 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efd5539380 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.306964864 +0000 UTC m=+2.244526182,LastTimestamp:2026-02-27 01:04:49.306964864 +0000 UTC m=+2.244526182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.530358 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efd56da8de openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.30867427 +0000 UTC m=+2.246235598,LastTimestamp:2026-02-27 01:04:49.30867427 +0000 UTC m=+2.246235598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.536458 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efe34ce89e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.541408926 +0000 UTC m=+2.478970214,LastTimestamp:2026-02-27 01:04:49.541408926 +0000 UTC m=+2.478970214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.543703 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efe43695d9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.556723161 +0000 UTC m=+2.494284479,LastTimestamp:2026-02-27 01:04:49.556723161 +0000 UTC m=+2.494284479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.549802 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efe45a2a87 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.559054983 +0000 UTC m=+2.496616271,LastTimestamp:2026-02-27 01:04:49.559054983 +0000 UTC m=+2.496616271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.556680 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f4eff249117b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.792815483 +0000 UTC m=+2.730376781,LastTimestamp:2026-02-27 01:04:49.792815483 +0000 UTC m=+2.730376781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.562636 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4eff259128c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.793864332 +0000 UTC m=+2.731425620,LastTimestamp:2026-02-27 01:04:49.793864332 +0000 UTC m=+2.731425620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.568978 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4eff2a656f8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.79892812 +0000 UTC m=+2.736489448,LastTimestamp:2026-02-27 01:04:49.79892812 +0000 UTC m=+2.736489448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.573861 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4eff3179b7e openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.80635123 +0000 UTC m=+2.743912518,LastTimestamp:2026-02-27 01:04:49.80635123 +0000 UTC m=+2.743912518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.577647 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4eff31eee45 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.806831173 +0000 UTC m=+2.744392461,LastTimestamp:2026-02-27 01:04:49.806831173 +0000 UTC m=+2.744392461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.583825 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4eff3e8267d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.820018301 +0000 UTC m=+2.757579809,LastTimestamp:2026-02-27 01:04:49.820018301 +0000 UTC m=+2.757579809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.587865 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4f00146ff73 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.044338035 +0000 UTC m=+2.981899333,LastTimestamp:2026-02-27 01:04:50.044338035 +0000 UTC m=+2.981899333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.593648 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f0019a5f45 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.049802053 +0000 UTC m=+2.987363341,LastTimestamp:2026-02-27 01:04:50.049802053 +0000 UTC m=+2.987363341,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.597423 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f4f001b48475 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.051515509 +0000 UTC m=+2.989076797,LastTimestamp:2026-02-27 01:04:50.051515509 +0000 UTC m=+2.989076797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.600706 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f001e3dd0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.054618383 +0000 UTC m=+2.992179671,LastTimestamp:2026-02-27 01:04:50.054618383 +0000 UTC 
m=+2.992179671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.605030 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4f001f2b048 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.05558996 +0000 UTC m=+2.993151248,LastTimestamp:2026-02-27 01:04:50.05558996 +0000 UTC m=+2.993151248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.608935 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4f002082caa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.056998058 +0000 UTC m=+2.994559346,LastTimestamp:2026-02-27 01:04:50.056998058 +0000 UTC m=+2.994559346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.612801 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f003101b6a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.074295146 +0000 UTC m=+3.011856434,LastTimestamp:2026-02-27 01:04:50.074295146 +0000 UTC m=+3.011856434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.617469 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897f4f0032aed4e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.076052814 +0000 UTC m=+3.013614092,LastTimestamp:2026-02-27 01:04:50.076052814 +0000 UTC m=+3.013614092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.618474 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f003859713 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.081994515 +0000 UTC m=+3.019555803,LastTimestamp:2026-02-27 01:04:50.081994515 +0000 UTC m=+3.019555803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.622226 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f00397a09f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.083176607 +0000 UTC m=+3.020737885,LastTimestamp:2026-02-27 01:04:50.083176607 +0000 UTC m=+3.020737885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.625731 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4f00ea7ed83 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.268794243 +0000 UTC m=+3.206355531,LastTimestamp:2026-02-27 01:04:50.268794243 +0000 UTC m=+3.206355531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.631331 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f00fe984a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.289869984 +0000 UTC m=+3.227431272,LastTimestamp:2026-02-27 01:04:50.289869984 +0000 UTC m=+3.227431272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.635595 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.635649 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4f010147cce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.29268603 +0000 UTC m=+3.230247318,LastTimestamp:2026-02-27 01:04:50.29268603 +0000 UTC m=+3.230247318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.639367 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.640574 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4f01028989a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.294003866 +0000 UTC m=+3.231565154,LastTimestamp:2026-02-27 01:04:50.294003866 +0000 UTC m=+3.231565154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.640741 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.640767 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.640775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.640794 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.645534 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f0119ae0be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.318270654 +0000 UTC m=+3.255831942,LastTimestamp:2026-02-27 01:04:50.318270654 +0000 UTC m=+3.255831942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.645683 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.651219 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f011b20169 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.319786345 +0000 UTC m=+3.257347633,LastTimestamp:2026-02-27 01:04:50.319786345 +0000 UTC m=+3.257347633,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.657973 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4f01df9e170 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.525823344 +0000 UTC m=+3.463384632,LastTimestamp:2026-02-27 01:04:50.525823344 +0000 UTC m=+3.463384632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.664921 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f01e16a0db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.527707355 +0000 UTC m=+3.465268633,LastTimestamp:2026-02-27 01:04:50.527707355 +0000 UTC m=+3.465268633,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.670614 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897f4f01f1784e7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.544542951 +0000 UTC m=+3.482104239,LastTimestamp:2026-02-27 01:04:50.544542951 +0000 UTC m=+3.482104239,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.674975 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f01f406823 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.547222563 +0000 UTC m=+3.484783851,LastTimestamp:2026-02-27 01:04:50.547222563 +0000 UTC m=+3.484783851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.681704 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f01f51f3fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.548372475 +0000 UTC m=+3.485933763,LastTimestamp:2026-02-27 01:04:50.548372475 +0000 UTC m=+3.485933763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.687756 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.687871 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f02c9e31b0 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.771472816 +0000 UTC m=+3.709034104,LastTimestamp:2026-02-27 01:04:50.771472816 +0000 UTC m=+3.709034104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.693420 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f02d5f03a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.784109478 +0000 UTC m=+3.721670776,LastTimestamp:2026-02-27 01:04:50.784109478 +0000 UTC m=+3.721670776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.697376 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f02d6bb739 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.784941881 +0000 UTC m=+3.722503169,LastTimestamp:2026-02-27 01:04:50.784941881 +0000 UTC m=+3.722503169,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.704985 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f02fb9ad91 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.823605649 +0000 UTC m=+3.761166947,LastTimestamp:2026-02-27 01:04:50.823605649 +0000 UTC m=+3.761166947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.712256 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f0395ca846 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.985281606 +0000 UTC m=+3.922842894,LastTimestamp:2026-02-27 01:04:50.985281606 +0000 UTC m=+3.922842894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.717011 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f03a1b7405 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.997785605 +0000 UTC m=+3.935346903,LastTimestamp:2026-02-27 01:04:50.997785605 +0000 UTC m=+3.935346903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.723846 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f03a908e69 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:51.005460073 +0000 UTC m=+3.943021361,LastTimestamp:2026-02-27 01:04:51.005460073 +0000 UTC m=+3.943021361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 
27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.728012 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f03b3148fe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:51.015993598 +0000 UTC m=+3.953554896,LastTimestamp:2026-02-27 01:04:51.015993598 +0000 UTC m=+3.953554896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.733662 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f06c34a11a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:51.838296346 +0000 UTC m=+4.775857674,LastTimestamp:2026-02-27 01:04:51.838296346 +0000 UTC m=+4.775857674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.737854 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f07bfd3adb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.103101147 +0000 UTC m=+5.040662465,LastTimestamp:2026-02-27 01:04:52.103101147 +0000 UTC m=+5.040662465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.744609 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f07c85e46a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.11205745 +0000 UTC m=+5.049618778,LastTimestamp:2026-02-27 01:04:52.11205745 +0000 UTC m=+5.049618778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.751225 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f07c9fcef8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.113755896 +0000 UTC m=+5.051317214,LastTimestamp:2026-02-27 01:04:52.113755896 +0000 UTC m=+5.051317214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.757949 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f08afa5538 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.354569528 +0000 UTC m=+5.292130876,LastTimestamp:2026-02-27 01:04:52.354569528 +0000 UTC m=+5.292130876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.762919 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.763187 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.764864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.764923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.764945 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.764974 4771 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f08be8dbc5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.370201541 +0000 UTC m=+5.307762829,LastTimestamp:2026-02-27 01:04:52.370201541 +0000 UTC m=+5.307762829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: I0227 01:05:22.765843 4771 scope.go:117] "RemoveContainer" containerID="bcecf191573ec1e56a589b3e69a074799b012a802c1a3e8cacc7dd3dddd65e9e" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.766183 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.770836 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f08bfd34e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.371535077 +0000 UTC m=+5.309096405,LastTimestamp:2026-02-27 01:04:52.371535077 +0000 UTC m=+5.309096405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.777053 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f09a96839c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.61646326 +0000 UTC m=+5.554024578,LastTimestamp:2026-02-27 01:04:52.61646326 +0000 UTC m=+5.554024578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 
27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.783271 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f09b72fc29 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.630912041 +0000 UTC m=+5.568473339,LastTimestamp:2026-02-27 01:04:52.630912041 +0000 UTC m=+5.568473339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.788527 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f09b84f83b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.632090683 +0000 UTC m=+5.569651981,LastTimestamp:2026-02-27 01:04:52.632090683 +0000 UTC m=+5.569651981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.795027 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f0ac450b26 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.913113894 +0000 UTC m=+5.850675212,LastTimestamp:2026-02-27 01:04:52.913113894 +0000 UTC m=+5.850675212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.801264 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f0ad4c0c3c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.93035014 +0000 UTC m=+5.867911458,LastTimestamp:2026-02-27 01:04:52.93035014 +0000 UTC m=+5.867911458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.807679 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f0ad667778 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:52.932081528 +0000 UTC m=+5.869642856,LastTimestamp:2026-02-27 01:04:52.932081528 +0000 UTC m=+5.869642856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.813136 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f0bd2c429c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:53.196702364 +0000 UTC m=+6.134263682,LastTimestamp:2026-02-27 01:04:53.196702364 +0000 UTC m=+6.134263682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.819177 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897f4f0be6591db openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:53.217235419 +0000 UTC m=+6.154796747,LastTimestamp:2026-02-27 01:04:53.217235419 +0000 UTC m=+6.154796747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.829375 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 01:05:22 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f4f1934fd1b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 27 01:05:22 crc kubenswrapper[4771]: body: Feb 27 01:05:22 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:56.789356984 +0000 UTC m=+9.726918312,LastTimestamp:2026-02-27 01:04:56.789356984 +0000 UTC m=+9.726918312,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 01:05:22 crc kubenswrapper[4771]: > Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.836784 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4f19350f36b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:56.789431147 +0000 UTC m=+9.726992465,LastTimestamp:2026-02-27 01:04:56.789431147 +0000 UTC m=+9.726992465,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.844940 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f4f02d6bb739\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f02d6bb739 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.784941881 +0000 UTC m=+3.722503169,LastTimestamp:2026-02-27 01:05:01.889985243 +0000 UTC m=+14.827546571,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.851451 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f4f0395ca846\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f0395ca846 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.985281606 +0000 UTC m=+3.922842894,LastTimestamp:2026-02-27 01:05:02.150729855 +0000 UTC m=+15.088291183,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.859126 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f4f03a1b7405\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f03a1b7405 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:50.997785605 +0000 UTC m=+3.935346903,LastTimestamp:2026-02-27 01:05:02.158026052 +0000 UTC m=+15.095587350,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.868411 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 01:05:22 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-apiserver-crc.1897f4f2d713553f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 27 01:05:22 crc kubenswrapper[4771]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 01:05:22 crc kubenswrapper[4771]: Feb 27 
01:05:22 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:02.221210943 +0000 UTC m=+15.158772271,LastTimestamp:2026-02-27 01:05:02.221210943 +0000 UTC m=+15.158772271,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 01:05:22 crc kubenswrapper[4771]: > Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.876292 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f2d7141a48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:02.221261384 +0000 UTC m=+15.158822712,LastTimestamp:2026-02-27 01:05:02.221261384 +0000 UTC m=+15.158822712,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.883063 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f4f2d713553f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 01:05:22 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-apiserver-crc.1897f4f2d713553f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 27 01:05:22 crc kubenswrapper[4771]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 01:05:22 crc kubenswrapper[4771]: Feb 27 01:05:22 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:02.221210943 +0000 UTC m=+15.158772271,LastTimestamp:2026-02-27 01:05:02.226335852 +0000 UTC m=+15.163897150,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 01:05:22 crc kubenswrapper[4771]: > Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.889540 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897f4f2d7141a48\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897f4f2d7141a48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:02.221261384 +0000 UTC m=+15.158822712,LastTimestamp:2026-02-27 01:05:02.226391303 +0000 UTC m=+15.163952611,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.900941 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 01:05:22 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f4f3e758f441 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 01:05:22 crc kubenswrapper[4771]: body: Feb 27 01:05:22 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:06.789176385 +0000 UTC m=+19.726737713,LastTimestamp:2026-02-27 01:05:06.789176385 +0000 UTC m=+19.726737713,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 01:05:22 crc kubenswrapper[4771]: > Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.907526 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4f3e75a6ec9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:06.789273289 +0000 UTC m=+19.726834617,LastTimestamp:2026-02-27 01:05:06.789273289 +0000 UTC m=+19.726834617,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.917286 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f4f3e758f441\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Feb 27 01:05:22 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f4f3e758f441 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 01:05:22 crc kubenswrapper[4771]: body: Feb 27 01:05:22 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:06.789176385 +0000 UTC m=+19.726737713,LastTimestamp:2026-02-27 01:05:16.789216933 +0000 UTC m=+29.726778261,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 01:05:22 crc kubenswrapper[4771]: > Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.923809 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f4f3e75a6ec9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4f3e75a6ec9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:06.789273289 +0000 UTC m=+19.726834617,LastTimestamp:2026-02-27 01:05:16.789281625 +0000 UTC m=+29.726842953,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.936080 4771 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4f63b9b3d44 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:16.792741188 +0000 UTC m=+29.730302516,LastTimestamp:2026-02-27 01:05:16.792741188 +0000 UTC m=+29.730302516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc 
kubenswrapper[4771]: E0227 01:05:22.942671 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f4efc018b661\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efc018b661 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:48.950785633 +0000 UTC m=+1.888346921,LastTimestamp:2026-02-27 01:05:16.915688348 +0000 UTC m=+29.853249666,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.949732 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f4efd48a6aed\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efd48a6aed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.293781741 +0000 UTC m=+2.231343059,LastTimestamp:2026-02-27 01:05:17.157934268 +0000 UTC m=+30.095495596,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:22 crc kubenswrapper[4771]: E0227 01:05:22.958798 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f4efd5539380\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4efd5539380 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:04:49.306964864 +0000 UTC m=+2.244526182,LastTimestamp:2026-02-27 01:05:17.171158896 +0000 UTC m=+30.108720224,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:23 crc kubenswrapper[4771]: I0227 01:05:23.690336 
4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:23 crc kubenswrapper[4771]: I0227 01:05:23.789513 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:05:23 crc kubenswrapper[4771]: I0227 01:05:23.789778 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:23 crc kubenswrapper[4771]: I0227 01:05:23.791190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:23 crc kubenswrapper[4771]: I0227 01:05:23.791232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:23 crc kubenswrapper[4771]: I0227 01:05:23.791251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:24 crc kubenswrapper[4771]: I0227 01:05:24.693651 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:25 crc kubenswrapper[4771]: I0227 01:05:25.691397 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:26 crc kubenswrapper[4771]: I0227 01:05:26.693703 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:26 crc kubenswrapper[4771]: I0227 01:05:26.789877 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 01:05:26 crc kubenswrapper[4771]: I0227 01:05:26.789985 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 01:05:26 crc kubenswrapper[4771]: E0227 01:05:26.797728 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f4f3e758f441\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 01:05:26 crc kubenswrapper[4771]: &Event{ObjectMeta:{kube-controller-manager-crc.1897f4f3e758f441 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 01:05:26 crc kubenswrapper[4771]: body: Feb 27 01:05:26 crc kubenswrapper[4771]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:06.789176385 +0000 UTC m=+19.726737713,LastTimestamp:2026-02-27 01:05:26.789951397 +0000 UTC m=+39.727512735,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 01:05:26 crc kubenswrapper[4771]: > Feb 27 01:05:26 crc kubenswrapper[4771]: E0227 01:05:26.803995 4771 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897f4f3e75a6ec9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897f4f3e75a6ec9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:05:06.789273289 +0000 UTC m=+19.726834617,LastTimestamp:2026-02-27 01:05:26.790025159 +0000 UTC m=+39.727586477,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:05:27 crc kubenswrapper[4771]: I0227 01:05:27.220935 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 01:05:27 crc kubenswrapper[4771]: I0227 01:05:27.242161 4771 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 01:05:27 crc kubenswrapper[4771]: I0227 01:05:27.681278 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:05:27 crc kubenswrapper[4771]: I0227 01:05:27.681537 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:27 crc kubenswrapper[4771]: I0227 01:05:27.683596 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:27 crc kubenswrapper[4771]: I0227 01:05:27.683821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:27 crc kubenswrapper[4771]: I0227 01:05:27.683968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:27 crc kubenswrapper[4771]: I0227 01:05:27.694161 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:27 crc kubenswrapper[4771]: E0227 01:05:27.870951 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:05:28 crc kubenswrapper[4771]: I0227 01:05:28.692787 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:29 crc kubenswrapper[4771]: W0227 01:05:29.230992 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 27 01:05:29 crc kubenswrapper[4771]: E0227 01:05:29.231076 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 27 01:05:29 crc kubenswrapper[4771]: E0227 01:05:29.643450 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 01:05:29 crc kubenswrapper[4771]: I0227 01:05:29.646077 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:29 crc kubenswrapper[4771]: I0227 01:05:29.647715 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:29 crc kubenswrapper[4771]: I0227 01:05:29.647771 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:29 crc kubenswrapper[4771]: I0227 01:05:29.647789 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:29 crc kubenswrapper[4771]: I0227 01:05:29.647824 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:05:29 crc kubenswrapper[4771]: E0227 01:05:29.654509 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 01:05:29 crc kubenswrapper[4771]: I0227 01:05:29.692735 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:30 crc kubenswrapper[4771]: I0227 01:05:30.692661 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:31 crc kubenswrapper[4771]: I0227 01:05:31.694056 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:32 crc kubenswrapper[4771]: W0227 01:05:32.578324 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:32 crc kubenswrapper[4771]: E0227 01:05:32.578406 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 27 01:05:32 crc kubenswrapper[4771]: I0227 01:05:32.693598 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:33 crc kubenswrapper[4771]: I0227 01:05:33.690780 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:33 crc kubenswrapper[4771]: I0227 01:05:33.793980 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:05:33 crc kubenswrapper[4771]: I0227 01:05:33.794120 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:33 crc kubenswrapper[4771]: I0227 01:05:33.796045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:33 crc kubenswrapper[4771]: I0227 01:05:33.796094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:33 crc kubenswrapper[4771]: I0227 01:05:33.796112 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:33 crc kubenswrapper[4771]: I0227 01:05:33.800914 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:05:33 crc kubenswrapper[4771]: W0227 01:05:33.961138 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 27 01:05:33 crc kubenswrapper[4771]: E0227 01:05:33.961195 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 27 01:05:34 crc kubenswrapper[4771]: I0227 01:05:34.005566 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:34 crc kubenswrapper[4771]: I0227 01:05:34.006602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:34 crc kubenswrapper[4771]: I0227 
01:05:34.006640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:34 crc kubenswrapper[4771]: I0227 01:05:34.006654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:34 crc kubenswrapper[4771]: I0227 01:05:34.721812 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:35 crc kubenswrapper[4771]: I0227 01:05:35.691762 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:36 crc kubenswrapper[4771]: I0227 01:05:36.655208 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:36 crc kubenswrapper[4771]: I0227 01:05:36.656925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:36 crc kubenswrapper[4771]: I0227 01:05:36.656991 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:36 crc kubenswrapper[4771]: I0227 01:05:36.657017 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:36 crc kubenswrapper[4771]: I0227 01:05:36.657060 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:05:36 crc kubenswrapper[4771]: E0227 01:05:36.667438 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 01:05:36 crc kubenswrapper[4771]: E0227 01:05:36.667531 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 01:05:36 crc kubenswrapper[4771]: I0227 01:05:36.691623 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:37 crc kubenswrapper[4771]: W0227 01:05:37.136022 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 27 01:05:37 crc kubenswrapper[4771]: E0227 01:05:37.136095 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 27 01:05:37 crc kubenswrapper[4771]: I0227 01:05:37.693089 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Feb 27 01:05:37 crc kubenswrapper[4771]: I0227 01:05:37.772282 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:37 crc kubenswrapper[4771]: I0227 01:05:37.774698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:37 crc kubenswrapper[4771]: I0227 01:05:37.774769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:37 crc kubenswrapper[4771]: I0227 01:05:37.774788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:37 crc kubenswrapper[4771]: I0227 01:05:37.775760 4771 scope.go:117] "RemoveContainer" containerID="bcecf191573ec1e56a589b3e69a074799b012a802c1a3e8cacc7dd3dddd65e9e" Feb 27 01:05:37 crc kubenswrapper[4771]: E0227 01:05:37.871398 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:05:38 crc kubenswrapper[4771]: I0227 01:05:38.690371 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:39 crc kubenswrapper[4771]: I0227 01:05:39.023188 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 01:05:39 crc kubenswrapper[4771]: I0227 01:05:39.026846 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846"} Feb 27 01:05:39 crc kubenswrapper[4771]: I0227 01:05:39.027143 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:39 crc kubenswrapper[4771]: I0227 01:05:39.036048 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:39 crc kubenswrapper[4771]: I0227 01:05:39.036128 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:39 crc kubenswrapper[4771]: I0227 01:05:39.036147 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:39 crc kubenswrapper[4771]: I0227 01:05:39.693144 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.032956 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.034738 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.037526 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846" exitCode=255 Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.037635 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846"} Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.037707 4771 scope.go:117] "RemoveContainer" containerID="bcecf191573ec1e56a589b3e69a074799b012a802c1a3e8cacc7dd3dddd65e9e" Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.037850 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.039150 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.039206 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.039230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.040346 4771 scope.go:117] "RemoveContainer" containerID="6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846" Feb 27 01:05:40 crc kubenswrapper[4771]: E0227 01:05:40.040716 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:40 crc kubenswrapper[4771]: I0227 01:05:40.694240 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:41 crc kubenswrapper[4771]: I0227 01:05:41.043576 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 01:05:41 crc kubenswrapper[4771]: I0227 01:05:41.693622 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:41 crc kubenswrapper[4771]: I0227 01:05:41.717628 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 01:05:41 crc kubenswrapper[4771]: I0227 01:05:41.717859 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:41 crc kubenswrapper[4771]: I0227 01:05:41.719764 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:41 crc kubenswrapper[4771]: I0227 01:05:41.719987 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:41 crc kubenswrapper[4771]: I0227 01:05:41.720130 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:42 crc kubenswrapper[4771]: I0227 01:05:42.692325 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:42 crc kubenswrapper[4771]: I0227 01:05:42.763349 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:05:42 crc kubenswrapper[4771]: I0227 01:05:42.763623 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:42 crc kubenswrapper[4771]: I0227 01:05:42.765163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:42 crc kubenswrapper[4771]: I0227 01:05:42.765221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:42 crc kubenswrapper[4771]: I0227 01:05:42.765239 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:42 crc kubenswrapper[4771]: I0227 01:05:42.766032 4771 scope.go:117] "RemoveContainer" containerID="6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846" Feb 27 01:05:42 crc kubenswrapper[4771]: E0227 01:05:42.766309 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:43 crc kubenswrapper[4771]: I0227 01:05:43.668150 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:43 crc kubenswrapper[4771]: I0227 01:05:43.669771 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:43 crc kubenswrapper[4771]: I0227 01:05:43.669836 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:43 crc kubenswrapper[4771]: I0227 01:05:43.669862 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:43 crc kubenswrapper[4771]: I0227 01:05:43.669923 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:05:43 crc kubenswrapper[4771]: E0227 01:05:43.675830 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 01:05:43 crc kubenswrapper[4771]: E0227 01:05:43.675898 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 01:05:43 crc kubenswrapper[4771]: I0227 01:05:43.688242 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:44 crc kubenswrapper[4771]: I0227 01:05:44.694019 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:45 crc kubenswrapper[4771]: I0227 01:05:45.038701 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:05:45 crc kubenswrapper[4771]: I0227 01:05:45.039310 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:45 crc kubenswrapper[4771]: I0227 01:05:45.041327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:45 crc kubenswrapper[4771]: I0227 01:05:45.041390 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:45 crc kubenswrapper[4771]: I0227 01:05:45.041405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:45 crc kubenswrapper[4771]: I0227 01:05:45.042464 4771 scope.go:117] "RemoveContainer" containerID="6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846" Feb 27 01:05:45 crc kubenswrapper[4771]: E0227 01:05:45.042771 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:45 crc kubenswrapper[4771]: I0227 01:05:45.691944 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:46 crc kubenswrapper[4771]: I0227 01:05:46.695436 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:47 crc kubenswrapper[4771]: I0227 01:05:47.693841 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:47 crc kubenswrapper[4771]: E0227 01:05:47.872048 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:05:48 crc kubenswrapper[4771]: I0227 01:05:48.695333 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:49 crc kubenswrapper[4771]: I0227 01:05:49.691770 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:50 crc kubenswrapper[4771]: I0227 01:05:50.676895 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:50 crc kubenswrapper[4771]: I0227 01:05:50.678843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:50 crc kubenswrapper[4771]: I0227 01:05:50.679336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:50 crc kubenswrapper[4771]: I0227 01:05:50.679514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:50 crc kubenswrapper[4771]: I0227 01:05:50.679717 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:05:50 crc kubenswrapper[4771]: E0227 01:05:50.683766 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 01:05:50 crc kubenswrapper[4771]: E0227 01:05:50.684103 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 01:05:50 crc kubenswrapper[4771]: I0227 01:05:50.687611 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:51 crc kubenswrapper[4771]: I0227 01:05:51.693194 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 01:05:52 crc kubenswrapper[4771]: I0227 01:05:52.415220 4771 csr.go:261] certificate signing request csr-ng2f8 is approved, waiting to be issued Feb 27 01:05:52 crc kubenswrapper[4771]: I0227 01:05:52.423149 4771 csr.go:257] certificate signing request csr-ng2f8 is issued Feb 27 01:05:52 crc kubenswrapper[4771]: I0227 01:05:52.477780 4771 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 27 01:05:52 crc kubenswrapper[4771]: I0227 01:05:52.543193 4771 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.424260 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 17:27:25.435205036 +0000 UTC Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.424324 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6712h21m32.010886185s for next certificate rotation Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.772879 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.774703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.774787 4771 
Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.772879 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.774703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.774787 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:53 crc kubenswrapper[4771]: I0227 01:05:53.774811 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.684220 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.685941 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.685985 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.685997 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.686096 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.697006 4771 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.697378 4771 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.697408 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.702231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.702278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.702296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.702320 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.702339 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:05:57Z","lastTransitionTime":"2026-02-27T01:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.719455 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.728048 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.728275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.728425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.728603 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.728768 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:05:57Z","lastTransitionTime":"2026-02-27T01:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.737866 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload omitted: identical to the 01:05:57.719455 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
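Each of the node-status patch retries logged around 01:05:57 fails for the same reason recorded at the end of its payload: the API server cannot reach the node.network-node-identity.openshift.io admission webhook on 127.0.0.1:9743, so every node-status PATCH is rejected until that webhook's server comes up. The Go sketch below is illustrative only and simply reproduces the failing dial; it assumes it is run on the node itself, and it skips TLS verification because it tests reachability, not certificate trust.

    package main

    import (
        "crypto/tls"
        "fmt"
        "net/http"
        "strings"
        "time"
    )

    func main() {
        // Probe the webhook endpoint the API server reports as unreachable.
        client := &http.Client{
            Timeout: 10 * time.Second,
            Transport: &http.Transport{
                // Deliberate: reachability check only, no trust decision.
                TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
            },
        }
        resp, err := client.Post("https://127.0.0.1:9743/node?timeout=10s",
            "application/json", strings.NewReader("{}"))
        if err != nil {
            // While the webhook server is down this prints the same
            // "connect: connection refused" seen in the kubelet log.
            fmt.Println("webhook unreachable:", err)
            return
        }
        defer resp.Body.Close()
        fmt.Println("webhook answered:", resp.Status)
    }

A refused connection here is consistent with the CNI-not-ready condition in the same entries: the webhook is presumably served by the cluster networking stack, which has not started yet at this point in the boot.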
Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.747727 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.747768 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.747779 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.747798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.747814 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:05:57Z","lastTransitionTime":"2026-02-27T01:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.757691 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload omitted: identical to the 01:05:57.719455 attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.764964 4771
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.765019 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.765028 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.765042 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.765054 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:05:57Z","lastTransitionTime":"2026-02-27T01:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.772671 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.773954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.774018 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.774038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.774206 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\
"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:05:57 crc kubenswrapper[4771]: 
E0227 01:05:57.774434 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.774509 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:57 crc kubenswrapper[4771]: I0227 01:05:57.775111 4771 scope.go:117] "RemoveContainer" containerID="6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846" Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.775422 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.872814 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.875126 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:57 crc kubenswrapper[4771]: E0227 01:05:57.975477 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.076925 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.178375 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.278484 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.378861 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.479727 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.580251 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.680662 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: I0227 01:05:58.693097 4771 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.780764 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.881910 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:58 crc kubenswrapper[4771]: E0227 01:05:58.982722 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.083380 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc 
kubenswrapper[4771]: E0227 01:05:59.184237 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.284764 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.385178 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.486348 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.586826 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.687653 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.788816 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.889773 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:05:59 crc kubenswrapper[4771]: E0227 01:05:59.990887 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.091034 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.191368 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.292490 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.393601 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.494753 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.595312 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.695948 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.796653 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.897596 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:00 crc kubenswrapper[4771]: I0227 01:06:00.949049 4771 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 27 01:06:00 crc kubenswrapper[4771]: E0227 01:06:00.998408 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.098617 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 
01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.199022 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.299381 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.399705 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.500719 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.601153 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.701892 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.802578 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:01 crc kubenswrapper[4771]: E0227 01:06:01.903736 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.004750 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.105426 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.205873 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.306657 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.407578 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.508613 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.609481 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.709657 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.810236 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:02 crc kubenswrapper[4771]: E0227 01:06:02.910806 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.011390 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.111597 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.212493 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.312674 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.413441 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.514020 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.614958 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.715990 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.816973 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:03 crc kubenswrapper[4771]: E0227 01:06:03.917171 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.017298 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.117822 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.218310 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.318434 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.419586 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.519798 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.620292 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.721192 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.821773 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:04 crc kubenswrapper[4771]: E0227 01:06:04.922871 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.023080 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.123860 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.224314 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.324465 4771 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.425447 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.525795 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.626188 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.726736 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.827343 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:05 crc kubenswrapper[4771]: E0227 01:06:05.927729 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.028385 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.129454 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.229851 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.330660 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.431862 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.532346 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.632546 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.732894 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.834027 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:06 crc kubenswrapper[4771]: E0227 01:06:06.935044 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.035215 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.135898 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.236921 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.337783 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.438296 4771 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.538912 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.639737 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.740578 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.840841 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.873897 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.922480 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.927969 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.928029 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.928047 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.928070 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.928101 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:07Z","lastTransitionTime":"2026-02-27T01:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.941322 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.947249 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.947305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.947331 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.947363 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.947387 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:07Z","lastTransitionTime":"2026-02-27T01:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.963421 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.968114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.968178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.968202 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.968234 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.968256 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:07Z","lastTransitionTime":"2026-02-27T01:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:07 crc kubenswrapper[4771]: E0227 01:06:07.980772 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.992467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.992578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.992609 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.992736 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:07 crc kubenswrapper[4771]: I0227 01:06:07.992767 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:07Z","lastTransitionTime":"2026-02-27T01:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.008579 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.008800 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.008840 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.109018 4771 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.210220 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.310967 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.412057 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.513162 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.614042 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.715212 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.816060 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:08 crc kubenswrapper[4771]: E0227 01:06:08.916697 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.017560 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.118502 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.219358 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.320374 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.420651 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.520841 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.621295 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.721645 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.822612 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:09 crc kubenswrapper[4771]: E0227 01:06:09.923621 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.024354 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.124475 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 
01:06:10.224909 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.326011 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.427018 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.528209 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.629244 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.729910 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.830392 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:10 crc kubenswrapper[4771]: E0227 01:06:10.930900 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.031769 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.132349 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.232896 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.333768 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.433857 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.534745 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.635109 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.735920 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.837056 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:11 crc kubenswrapper[4771]: E0227 01:06:11.937176 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.038194 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.138994 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.239146 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc 
kubenswrapper[4771]: E0227 01:06:12.339468 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.440644 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.541830 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.642164 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.742308 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: I0227 01:06:12.772848 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 01:06:12 crc kubenswrapper[4771]: I0227 01:06:12.774981 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:12 crc kubenswrapper[4771]: I0227 01:06:12.775036 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:12 crc kubenswrapper[4771]: I0227 01:06:12.775056 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:12 crc kubenswrapper[4771]: I0227 01:06:12.776025 4771 scope.go:117] "RemoveContainer" containerID="6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.776447 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.842468 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:12 crc kubenswrapper[4771]: E0227 01:06:12.943543 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.043938 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.144700 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.245118 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.345637 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.446734 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.547867 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc 
kubenswrapper[4771]: E0227 01:06:13.647977 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.748482 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.849212 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:13 crc kubenswrapper[4771]: E0227 01:06:13.949784 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.050762 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.151613 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.252391 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.353003 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.453923 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.554320 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.655616 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.756487 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.857099 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:14 crc kubenswrapper[4771]: E0227 01:06:14.958049 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.058479 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.158936 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.259595 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.360066 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.461214 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.561376 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.662395 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 
27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.762910 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.863669 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:15 crc kubenswrapper[4771]: E0227 01:06:15.964250 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.039451 4771 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.066958 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.067027 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.067052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.067084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.067111 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.169919 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.169976 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.169995 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.170019 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.170036 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.272780 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.272843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.272861 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.272884 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.272903 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.375400 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.375449 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.375468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.375492 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.375510 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.478768 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.478814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.478832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.478855 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.478872 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.588223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.588272 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.588285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.588306 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.588319 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.690374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.690446 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.690468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.690498 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.690519 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.739990 4771 apiserver.go:52] "Watching apiserver" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.745355 4771 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.745671 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.746309 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.746472 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.746541 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.746893 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.746958 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.747044 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.747231 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.747531 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.748236 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.749271 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.751042 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.751310 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.751639 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.751741 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.752061 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.752784 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.753279 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.753666 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.794479 4771 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.796091 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.798288 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.798360 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.798385 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.798418 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.798443 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.799964 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.809381 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.829534 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
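
All three "Failed to update status for pod" entries fail the same way: before accepting each status patch, the API server calls the pod.network-node-identity.openshift.io admission webhook, whose backend on 127.0.0.1:9743 is not listening yet (its own pod is among those still waiting on the network), so every patch bounces with connection refused. A throwaway probe sketch, assuming only the address quoted in the log, reproduces the check from the node:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Address taken verbatim from the failed webhook POSTs above.
        conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
        if err != nil {
            // Same failure mode the status_manager reports:
            // "dial tcp 127.0.0.1:9743: connect: connection refused".
            fmt.Println("webhook endpoint not reachable:", err)
            return
        }
        defer conn.Close()
        fmt.Println("webhook endpoint is accepting TCP connections")
    }

The status updates are retried, so they should go through once the network-node-identity webhook starts serving on that port.
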
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837425 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837458 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837490 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837525 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837584 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837648 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837685 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837718 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837749 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837782 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837816 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837849 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837880 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837915 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837948 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.837981 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838015 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838047 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838115 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838150 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838186 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838221 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838254 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838286 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838322 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838357 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838390 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838424 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838456 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838462 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838490 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838529 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838586 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838620 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838655 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838689 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838721 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 
01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838755 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838761 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838789 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838823 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838855 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838887 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838923 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838957 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.838988 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839022 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839055 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839090 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839175 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839213 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839246 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839279 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839311 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839344 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839377 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 
01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839442 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839477 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839515 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839576 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
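
This long run of reconciler_common.go:159 "UnmountVolume started" entries, interleaved with operation_generator.go:803 "TearDown succeeded" confirmations like the one just below, is the kubelet's volume manager reconciling after restart: every volume still mounted in its actual state of world but absent from the freshly repopulated desired state of world (the owning pods were deleted) gets an unmount operation. The following is a toy sketch of that desired-vs-actual pattern, using a few pod UIDs from these entries purely as sample data; it is illustrative only, not the kubelet's actual reconciler:

    package main

    import "fmt"

    func main() {
        // Volumes the kubelet still believes are mounted (actual state of world),
        // keyed by volume name with the owning pod UID as the value.
        actual := map[string]string{
            "kube-api-access-v47cf":  "c03ee662-fb2f-4fc4-a2c1-af487c19d254",
            "srv-cert":               "b6312bbd-5731-4ea0-a20f-81d5a57df44a",
            "marketplace-trusted-ca": "b6cd30de-2eeb-49a2-ab40-9167f4560ff5",
        }
        // Volumes that should be mounted (desired state of world). After the
        // pods above were deleted, none of their volumes remain desired.
        desired := map[string]bool{}

        // Anything mounted but no longer desired gets an unmount operation,
        // mirroring the "UnmountVolume started" -> "TearDown succeeded" pairs.
        for vol, podUID := range actual {
            if !desired[vol] {
                fmt.Printf("UnmountVolume started for volume %q pod %q\n", vol, podUID)
                fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", vol)
            }
        }
    }

In the real kubelet the unmount is asynchronous, which is why the "started" entries batch together and the "TearDown succeeded" confirmations trickle in among them.
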
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839687 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839726 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839793 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839850 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839857 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839913 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839943 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839966 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.839992 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840017 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840039 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840060 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840105 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840127 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840153 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840175 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840220 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840240 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840285 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840309 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840333 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840375 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840399 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840503 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840527 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840575 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840624 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840653 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840725 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840752 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840776 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840798 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840820 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840842 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840867 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840889 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840913 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840936 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840958 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.840983 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841028 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841078 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841102 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841126 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841147 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841170 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 
crc kubenswrapper[4771]: I0227 01:06:16.841195 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841219 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841241 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841263 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841284 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841272 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841308 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841500 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841597 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841659 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841715 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841772 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841825 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841879 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841935 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841982 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" 
(UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842030 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842075 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842253 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842306 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842355 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842408 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842460 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842513 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842605 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842667 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842723 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842774 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842826 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842880 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842936 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842989 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843045 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843095 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 01:06:16 crc 
kubenswrapper[4771]: I0227 01:06:16.843148 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843208 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843365 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843421 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843479 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843639 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " 
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843748 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843804 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843901 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843958 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844061 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844114 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844168 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844229 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844281 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844394 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844448 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844505 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844603 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844662 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844717 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844780 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844835 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844886 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844938 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845044 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845094 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845147 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845251 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845403 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.849946 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841510 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.841625 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842146 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842354 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842406 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842581 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.842827 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843351 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843425 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.843456 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844115 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844354 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.844665 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845065 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845113 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845136 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845404 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.863447 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.863476 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.863821 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.864609 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.864700 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.864972 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.865047 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.865038 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845650 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845648 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845821 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845859 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.845799 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.846266 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.846298 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.846809 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.847175 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.847182 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.847290 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.847371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.847422 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.848069 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.848093 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.848368 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.849211 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.849357 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.849526 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.849631 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.850484 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.850605 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.850791 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.850963 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.851001 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.850644 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.851314 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.851644 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.851679 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.851758 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.851791 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.852145 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.852215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.852595 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.852833 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.852976 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.852982 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853026 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853126 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853397 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853415 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853448 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853464 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853696 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853831 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853895 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853914 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.853966 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.854028 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.854306 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.854429 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.854479 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.854535 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.855034 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.855208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.855411 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.855484 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.856023 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.856177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.856191 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.856350 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.856425 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.856636 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.857084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.857156 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.857299 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.857440 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). 
InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.857536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.857980 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.858379 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.858646 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.858681 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.858808 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.858921 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.858886 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.859455 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.859477 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.860535 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.860601 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.860775 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.860829 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.860855 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.860933 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.861121 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.861139 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.861162 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.861347 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.861509 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.861501 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.861996 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.862044 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.862171 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.862238 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.862249 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.862870 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.867088 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.845586 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 01:06:17.345504589 +0000 UTC m=+90.283065927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.867920 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.867959 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.868013 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.868055 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.870704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.870819 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.870910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.871003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.859058 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.871216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.871319 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.871452 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.871544 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-27 01:06:17.371515313 +0000 UTC m=+90.309076621 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.872992 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.873323 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.873784 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.874077 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.874206 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.874494 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.874578 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.874701 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.875043 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.874747 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.874770 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.875214 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.875401 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.875428 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.875585 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.875605 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.876020 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.876243 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.876452 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.876534 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.876651 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.873454 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.876866 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.876925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.877006 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:17.376981451 +0000 UTC m=+90.314542739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877333 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.876684 4771 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877600 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877641 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877850 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877865 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877876 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877891 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877902 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877912 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877922 4771 reconciler_common.go:293] "Volume detached for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877934 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877946 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877959 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877974 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.877992 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878010 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878024 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878037 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878047 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878063 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878073 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878086 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878096 4771 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878105 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878115 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878127 4771 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878136 4771 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878145 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878154 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878165 4771 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878174 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878185 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878197 4771 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878206 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878218 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878226 4771 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878237 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878246 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878239 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878257 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878349 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878374 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878394 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878413 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878441 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878459 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878477 4771 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878502 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878522 4771 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878540 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878583 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878607 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878627 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878644 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878662 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878685 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878707 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878727 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878745 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878767 4771 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878788 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878805 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878827 4771 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878845 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878865 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878882 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878905 4771 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878922 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878950 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878968 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.878991 4771 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879010 4771 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879027 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879055 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879074 4771 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879091 4771 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879109 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879132 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879148 4771 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879166 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879189 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879216 4771 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879234 4771 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879406 4771 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879433 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879493 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc 
kubenswrapper[4771]: I0227 01:06:16.879516 4771 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879653 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879701 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879731 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879815 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879895 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879915 4771 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880152 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880171 4771 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.879927 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880225 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880367 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880407 4771 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880475 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880506 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880624 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880664 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880691 4771 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880771 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880852 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880874 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880893 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880950 4771 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.880968 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881017 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881039 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881063 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881112 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881130 4771 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881185 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881205 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881224 4771 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881274 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881301 4771 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881354 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881374 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881391 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881447 4771 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881467 4771 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881485 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881582 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881605 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881622 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881675 4771 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881701 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881748 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881766 4771 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881785 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881839 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881858 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881878 4771 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881927 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.881952 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882001 4771 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882020 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882045 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882094 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882112 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882129 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882183 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882199 4771 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882216 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882265 4771 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882289 4771 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882335 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882354 4771 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882377 4771 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882425 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.882935 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.883010 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.883152 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.883390 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.883411 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.883451 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.883645 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.883723 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.883973 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.885161 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.886065 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.886490 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.886923 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.892109 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.892499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.892682 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.892713 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.892728 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.892796 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:17.392772249 +0000 UTC m=+90.330333537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.892899 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.892935 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.892985 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.893104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.893121 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.893194 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.894303 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.894746 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.894938 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.896203 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.896458 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.896786 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:4
8Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.897771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.898519 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.901514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:16 crc 
kubenswrapper[4771]: I0227 01:06:16.901545 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.901584 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.901601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.901610 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:16Z","lastTransitionTime":"2026-02-27T01:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.903834 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.905467 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.905509 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.905673 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.905543 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:16 crc kubenswrapper[4771]: E0227 01:06:16.905935 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:17.405896695 +0000 UTC m=+90.343458023 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.907577 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.908733 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.908741 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.908831 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.909240 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.913214 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.913272 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.913333 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.909296 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.914668 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.918767 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.921133 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.924689 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.926934 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.934490 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.936863 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.942470 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.945364 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983860 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983919 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983930 4771 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983939 4771 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983949 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983959 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983967 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983975 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983983 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983991 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.983999 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984007 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984015 4771 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984022 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984030 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984038 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984045 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984053 4771 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984061 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984068 4771 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984076 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984084 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984091 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984099 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984107 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984114 4771 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984123 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984131 4771 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984139 4771 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984148 4771 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984156 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984165 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984174 4771 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984191 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984199 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984207 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984215 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984223 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984231 4771 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984241 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984249 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984256 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984265 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984276 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984287 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984300 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984312 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984567 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 01:06:16 crc kubenswrapper[4771]: I0227 01:06:16.984862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.003496 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.003519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.003527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.003539 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.003561 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.074273 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.090194 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.103636 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 27 01:06:17 crc kubenswrapper[4771]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Feb 27 01:06:17 crc kubenswrapper[4771]: set -o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Feb 27 01:06:17 crc kubenswrapper[4771]: source /etc/kubernetes/apiserver-url.env
Feb 27 01:06:17 crc kubenswrapper[4771]: else
Feb 27 01:06:17 crc kubenswrapper[4771]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Feb 27 01:06:17 crc kubenswrapper[4771]: exit 1
Feb 27 01:06:17 crc kubenswrapper[4771]: fi
Feb 27 01:06:17 crc kubenswrapper[4771]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Feb 27 01:06:17 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Feb 27 01:06:17 crc kubenswrapper[4771]: > logger="UnhandledError"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.104777 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.106355 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.106408 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.106433 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.106584 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.106610 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.107533 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 01:06:17 crc kubenswrapper[4771]: W0227 01:06:17.120251 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-df81fbedc40a691bf7a68b47b0b692b1f4bd61b38d7101bb2862a8585ccfa8fc WatchSource:0}: Error finding container df81fbedc40a691bf7a68b47b0b692b1f4bd61b38d7101bb2862a8585ccfa8fc: Status 404 returned error can't find the container with id df81fbedc40a691bf7a68b47b0b692b1f4bd61b38d7101bb2862a8585ccfa8fc
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.124110 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 27 01:06:17 crc kubenswrapper[4771]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Feb 27 01:06:17 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then
Feb 27 01:06:17 crc kubenswrapper[4771]: set -o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: source "/env/_master"
Feb 27 01:06:17 crc kubenswrapper[4771]: set +o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: fi
Feb 27 01:06:17 crc kubenswrapper[4771]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Feb 27 01:06:17 crc kubenswrapper[4771]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Feb 27 01:06:17 crc kubenswrapper[4771]: ho_enable="--enable-hybrid-overlay"
Feb 27 01:06:17 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Feb 27 01:06:17 crc kubenswrapper[4771]: # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Feb 27 01:06:17 crc kubenswrapper[4771]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Feb 27 01:06:17 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Feb 27 01:06:17 crc kubenswrapper[4771]: --webhook-cert-dir="/etc/webhook-cert" \
Feb 27 01:06:17 crc kubenswrapper[4771]: --webhook-host=127.0.0.1 \
Feb 27 01:06:17 crc kubenswrapper[4771]: --webhook-port=9743 \
Feb 27 01:06:17 crc kubenswrapper[4771]: ${ho_enable} \
Feb 27 01:06:17 crc kubenswrapper[4771]: --enable-interconnect \
Feb 27 01:06:17 crc kubenswrapper[4771]: --disable-approver \
Feb 27 01:06:17 crc kubenswrapper[4771]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Feb 27 01:06:17 crc kubenswrapper[4771]: --wait-for-kubernetes-api=200s \
Feb 27 01:06:17 crc kubenswrapper[4771]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Feb 27 01:06:17 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}"
Feb 27 01:06:17 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Feb 27 01:06:17 crc kubenswrapper[4771]: > logger="UnhandledError"
Feb 27 01:06:17 crc kubenswrapper[4771]: W0227 01:06:17.128892 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-60f2786975cf86ce51fd78bd35efb71c705785fca6b91b2e812ecf9e846b95b4 WatchSource:0}: Error finding container 60f2786975cf86ce51fd78bd35efb71c705785fca6b91b2e812ecf9e846b95b4: Status 404 returned error can't find the container with id 60f2786975cf86ce51fd78bd35efb71c705785fca6b91b2e812ecf9e846b95b4
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.129658 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 27 01:06:17 crc kubenswrapper[4771]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Feb 27 01:06:17 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then
Feb 27 01:06:17 crc kubenswrapper[4771]: set -o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: source "/env/_master"
Feb 27 01:06:17 crc kubenswrapper[4771]: set +o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: fi
Feb 27 01:06:17 crc kubenswrapper[4771]: 
Feb 27 01:06:17 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Feb 27 01:06:17 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Feb 27 01:06:17 crc kubenswrapper[4771]: --disable-webhook \
Feb 27 01:06:17 crc kubenswrapper[4771]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Feb 27 01:06:17 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}"
Feb 27 01:06:17 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Feb 27 01:06:17 crc kubenswrapper[4771]: > logger="UnhandledError"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.130920 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.134089 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.135629 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.143536 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"60f2786975cf86ce51fd78bd35efb71c705785fca6b91b2e812ecf9e846b95b4"}
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.148195 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.149448 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.149982 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"df81fbedc40a691bf7a68b47b0b692b1f4bd61b38d7101bb2862a8585ccfa8fc"}
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.151346 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6e63c4d3d937d933f5743ae7809e59e3d49986c724d7cb7263f86f5f72005400"}
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.152805 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 27 01:06:17 crc kubenswrapper[4771]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Feb 27 01:06:17 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then
Feb 27 01:06:17 crc kubenswrapper[4771]: set -o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: source "/env/_master"
Feb 27 01:06:17 crc kubenswrapper[4771]: set +o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: fi
Feb 27 01:06:17 crc kubenswrapper[4771]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Feb 27 01:06:17 crc kubenswrapper[4771]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Feb 27 01:06:17 crc kubenswrapper[4771]: ho_enable="--enable-hybrid-overlay"
Feb 27 01:06:17 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Feb 27 01:06:17 crc kubenswrapper[4771]: # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Feb 27 01:06:17 crc kubenswrapper[4771]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Feb 27 01:06:17 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Feb 27 01:06:17 crc kubenswrapper[4771]: --webhook-cert-dir="/etc/webhook-cert" \
Feb 27 01:06:17 crc kubenswrapper[4771]: --webhook-host=127.0.0.1 \
Feb 27 01:06:17 crc kubenswrapper[4771]: --webhook-port=9743 \
Feb 27 01:06:17 crc kubenswrapper[4771]: ${ho_enable} \
Feb 27 01:06:17 crc kubenswrapper[4771]: --enable-interconnect \
Feb 27 01:06:17 crc kubenswrapper[4771]: --disable-approver \
Feb 27 01:06:17 crc kubenswrapper[4771]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Feb 27 01:06:17 crc kubenswrapper[4771]: --wait-for-kubernetes-api=200s \
Feb 27 01:06:17 crc kubenswrapper[4771]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Feb 27 01:06:17 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}"
Feb 27 01:06:17 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Feb 27 01:06:17 crc kubenswrapper[4771]: > logger="UnhandledError"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.153498 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 27 01:06:17 crc kubenswrapper[4771]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Feb 27 01:06:17 crc kubenswrapper[4771]: set -o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Feb 27 01:06:17 crc kubenswrapper[4771]: source /etc/kubernetes/apiserver-url.env
Feb 27 01:06:17 crc kubenswrapper[4771]: else
Feb 27 01:06:17 crc kubenswrapper[4771]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Feb 27 01:06:17 crc kubenswrapper[4771]: exit 1
Feb 27 01:06:17 crc kubenswrapper[4771]: fi
Feb 27 01:06:17 crc kubenswrapper[4771]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Feb 27 01:06:17 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Feb 27 01:06:17 crc kubenswrapper[4771]: > logger="UnhandledError"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.154858 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.155794 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 27 01:06:17 crc kubenswrapper[4771]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Feb 27 01:06:17 crc kubenswrapper[4771]: if [[ -f "/env/_master" ]]; then
Feb 27 01:06:17 crc kubenswrapper[4771]: set -o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: source "/env/_master"
Feb 27 01:06:17 crc kubenswrapper[4771]: set +o allexport
Feb 27 01:06:17 crc kubenswrapper[4771]: fi
Feb 27 01:06:17 crc kubenswrapper[4771]: 
Feb 27 01:06:17 crc kubenswrapper[4771]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Feb 27 01:06:17 crc kubenswrapper[4771]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Feb 27 01:06:17 crc kubenswrapper[4771]: --disable-webhook \
Feb 27 01:06:17 crc kubenswrapper[4771]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Feb 27 01:06:17 crc kubenswrapper[4771]: --loglevel="${LOGLEVEL}"
Feb 27 01:06:17 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Feb 27 01:06:17 crc kubenswrapper[4771]: > logger="UnhandledError"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.157822 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.158948 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.171202 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.191126 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.203225 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.211711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.211748 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.211760 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.211776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.211787 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.213177 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.223632 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.233256 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.247224 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.255821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.266329 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.277861 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.288641 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.306022 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.313795 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.313828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.313837 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.313850 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.313858 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.316848 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.390518 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.390802 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:06:18.390742774 +0000 UTC m=+91.328304102 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.390950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.391026 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.391104 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.391189 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:18.391168166 +0000 UTC m=+91.328729454 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.391249 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.391352 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:18.39133578 +0000 UTC m=+91.328897098 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.416522 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.416637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.416658 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.416701 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.416719 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.491950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.492020 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.492152 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.492169 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.492180 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.492228 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:18.492211692 +0000 UTC m=+91.429772980 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.492355 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.492426 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.492452 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 01:06:17 crc kubenswrapper[4771]: E0227 01:06:17.492618 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:18.492533341 +0000 UTC m=+91.430094679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.519876 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.519916 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.519928 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.519940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.519950 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.622663 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.622746 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.622770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.622805 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.622829 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.726186 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.726250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.726267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.726298 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.726320 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.779381 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.780058 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.780753 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.781314 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.781849 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.782317 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.782874 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.783372 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.783989 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.784497 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.784982 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.785667 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.786149 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.786663 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.787165 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.790007 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.790540 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.790912 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.791786 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.792334 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.792838 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.793849 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.794263 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.795341 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.795749 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.796798 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.797428 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.798225 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.798829 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.799426 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.800251 4771 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.800351 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.802730 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.802821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.803582 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.804037 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.805657 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.806682 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.807173 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.808286 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.808995 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.809428 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.810489 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.811481 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.812161 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.813314 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.814532 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.816615 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.818366 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.820126 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.821887 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.822241 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.823609 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.826720 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.828049 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.830754 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.832743 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.832802 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.832827 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.832856 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.832883 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.838299 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.858210 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.871227 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.891260 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.905362 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.935686 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.935766 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.935794 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.935826 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:17 crc kubenswrapper[4771]: I0227 01:06:17.935852 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:17Z","lastTransitionTime":"2026-02-27T01:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.038667 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.038734 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.038750 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.038777 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.038795 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.092012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.092124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.092327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.092429 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.092452 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.110054 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.115391 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.115474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.115520 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.115544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.115609 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.132619 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the first occurrence above; omitted ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.138269 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.138330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.138348 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.138373 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.138390 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.154209 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the first occurrence above; omitted ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.158219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.158420 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.158580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.158720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.158861 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.174183 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the first occurrence above; omitted ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.179201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.179377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.179528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.179791 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.179934 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.192245 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the first occurrence above; omitted ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.192508 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.194608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.194672 4771
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.194689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.194712 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.194730 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.297274 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.297345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.297363 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.297392 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.297415 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.400331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.400413 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.400437 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.400466 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.400487 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.401360 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.401505 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.401513 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:06:20.401493466 +0000 UTC m=+93.339054754 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.401592 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.401610 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.401630 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:20.401622609 +0000 UTC m=+93.339183897 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.401813 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.403313 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:20.403277324 +0000 UTC m=+93.340838642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.502160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.502228 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.502415 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.502418 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.502441 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.502465 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.502467 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:18 crc kubenswrapper[4771]: 
E0227 01:06:18.502491 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.502537 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:20.502516572 +0000 UTC m=+93.440077900 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.502597 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:20.502585533 +0000 UTC m=+93.440146851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.503947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.503993 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.504011 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.504035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.504055 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.606049 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.606118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.606142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.606169 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.606192 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.708464 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.708524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.708542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.708602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.708620 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.773098 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.773139 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.773098 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.773298 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.773486 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:18 crc kubenswrapper[4771]: E0227 01:06:18.773694 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.811900 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.811978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.811999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.812021 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.812038 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.914442 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.914496 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.914516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.914537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:18 crc kubenswrapper[4771]: I0227 01:06:18.914580 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:18Z","lastTransitionTime":"2026-02-27T01:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.016882 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.016935 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.016948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.016968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.016980 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.119992 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.120072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.120088 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.120108 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.120119 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.222175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.222245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.222268 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.222300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.222324 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.324598 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.324668 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.324692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.324722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.324746 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.427645 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.427707 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.427731 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.427762 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.427786 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.530740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.530797 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.530823 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.530852 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.530874 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.634307 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.634380 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.634398 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.634422 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.634439 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.737679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.738024 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.738044 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.738066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.738083 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.839990 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.840266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.840353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.840447 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.840541 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.943165 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.943325 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.943412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.943503 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:19 crc kubenswrapper[4771]: I0227 01:06:19.943608 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:19Z","lastTransitionTime":"2026-02-27T01:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.045948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.046001 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.046016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.046039 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.046055 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.148377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.148421 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.148438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.148459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.148476 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.250732 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.250758 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.250769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.250781 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.250791 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.353182 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.353295 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.353315 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.353335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.353350 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.420183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.420334 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.420374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.420398 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:06:24.420366798 +0000 UTC m=+97.357928126 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.420489 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.420534 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.420541 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:24.420526962 +0000 UTC m=+97.358088280 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.420661 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:24.420637805 +0000 UTC m=+97.358199133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.456436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.456467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.456479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.456493 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.456503 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.521274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.521337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.521491 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.521512 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.521540 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.521579 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.521633 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:24.5216157 +0000 UTC m=+97.459176998 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.521515 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.521673 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.521700 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:24.521692592 +0000 UTC m=+97.459253890 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.559346 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.559412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.559434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.559460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.559479 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.662707 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.662765 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.662782 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.662805 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.662823 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.765608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.765637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.765648 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.765662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.765673 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.772431 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.772534 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.772615 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.772678 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.772727 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:20 crc kubenswrapper[4771]: E0227 01:06:20.772787 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.868535 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.868586 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.868596 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.868610 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.868620 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.971681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.971756 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.971782 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.971810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:20 crc kubenswrapper[4771]: I0227 01:06:20.971831 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:20Z","lastTransitionTime":"2026-02-27T01:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.075364 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.075418 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.075438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.075467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.075490 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.178333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.178382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.178399 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.178420 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.178440 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.281118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.281178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.281195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.281218 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.281234 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.383685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.383766 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.383790 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.383818 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.383840 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.487129 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.487197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.487214 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.487243 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.487261 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.590215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.590281 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.590299 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.590323 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.590340 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.693195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.693243 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.693259 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.693282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.693299 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.795708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.795764 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.795776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.795792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.795804 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.898499 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.898577 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.898594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.898617 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:21 crc kubenswrapper[4771]: I0227 01:06:21.898636 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:21Z","lastTransitionTime":"2026-02-27T01:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.001886 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.001961 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.001985 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.002014 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.002036 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.104361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.104430 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.104448 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.104473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.104491 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.213748 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.213798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.213810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.213828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.213842 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.317044 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.317105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.317124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.317148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.317165 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.420210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.420283 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.420302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.420323 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.420342 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.523167 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.523240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.523258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.523286 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.523304 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.627766 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.627843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.627867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.627895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.627918 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.730677 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.730739 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.730761 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.730792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.730813 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.772829 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.772961 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:22 crc kubenswrapper[4771]: E0227 01:06:22.773145 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.773205 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:22 crc kubenswrapper[4771]: E0227 01:06:22.773356 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:22 crc kubenswrapper[4771]: E0227 01:06:22.773470 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.834474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.834597 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.834615 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.834639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.834656 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.937460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.937520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.937544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.937620 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:22 crc kubenswrapper[4771]: I0227 01:06:22.937641 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:22Z","lastTransitionTime":"2026-02-27T01:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.040729 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.040805 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.040827 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.040921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.040958 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.144495 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.144580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.144598 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.144623 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.144642 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.246889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.246947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.246965 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.246988 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.247007 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.348982 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.349043 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.349060 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.349084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.349123 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.412179 4771 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.450812 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.450872 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.450894 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.450922 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.450943 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.553502 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.553541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.553588 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.553607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.553619 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.656635 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.656694 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.656719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.656748 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.656773 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.759989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.760050 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.760066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.760093 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.760111 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.863115 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.863170 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.863186 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.863213 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.863230 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.966290 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.966346 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.966364 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.966388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:23 crc kubenswrapper[4771]: I0227 01:06:23.966403 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:23Z","lastTransitionTime":"2026-02-27T01:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.069277 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.069351 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.069384 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.069411 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.069433 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.171004 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.171047 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.171059 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.171077 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.171088 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.273388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.273461 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.273484 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.273516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.273542 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.375832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.375893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.375910 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.375936 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.375952 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.457530 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.457666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.457722 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.457873 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.457914 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.457876 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:06:32.457829144 +0000 UTC m=+105.395390472 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.458081 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:32.458037529 +0000 UTC m=+105.395598847 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.458113 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:32.458102332 +0000 UTC m=+105.395663730 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.477868 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.477940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.477957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.477982 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.478001 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
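Editor's note: "object ... not registered" here typically means the kubelet's internal configmap/secret manager has not yet registered the mounting pod's objects after the restart; it does not by itself mean the ConfigMap or Secret is missing on the API server. A quick client-go check of the API-server side (illustrative; names taken from the log, kubeconfig access assumed):

    package main

    import (
    	"context"
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}
    	ns := "openshift-network-console"
    	// The two objects the failed mounts above reference.
    	if _, err := cs.CoreV1().ConfigMaps(ns).Get(context.TODO(),
    		"networking-console-plugin", metav1.GetOptions{}); err != nil {
    		fmt.Println("configmap:", err)
    	} else {
    		fmt.Println("configmap exists; failure is kubelet-local")
    	}
    	if _, err := cs.CoreV1().Secrets(ns).Get(context.TODO(),
    		"networking-console-plugin-cert", metav1.GetOptions{}); err != nil {
    		fmt.Println("secret:", err)
    	} else {
    		fmt.Println("secret exists; failure is kubelet-local")
    	}
    }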
Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.558759 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.558836 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.558990 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.559015 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.559033 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.559050 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.559095 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.559107 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:32.559085567 +0000 UTC m=+105.496646885 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.559113 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.559330 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:32.55923265 +0000 UTC m=+105.496793948 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.584954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.585019 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.585037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.585063 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.585080 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
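Editor's note: the nestedpendingoperations.go:348 entries implement per-volume retry with backoff; here each failed mount is rescheduled 8 s out ("durationBeforeRetry 8s", no retries permitted until 01:06:32). A generic sketch of that retry pattern with the apimachinery wait helpers; the parameters are illustrative, not the kubelet's actual values:

    package main

    import (
    	"errors"
    	"fmt"
    	"time"

    	"k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
    	attempt := 0
    	// Doubles the delay between attempts, like the growing
    	// durationBeforeRetry in the log (values here are made up).
    	backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 6}
    	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
    		attempt++
    		fmt.Println("mount attempt", attempt)
    		return false, nil // stand-in for a MountVolume.SetUp that keeps failing
    	})
    	if errors.Is(err, wait.ErrWaitTimeout) {
    		fmt.Println("giving up until the next sync; attempts:", attempt)
    	}
    }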
Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.687783 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.687865 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.687895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.687924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.687945 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.772288 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.772324 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.772330 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.772530 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.772763 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:24 crc kubenswrapper[4771]: E0227 01:06:24.772948 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.790662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.790690 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.790698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.790710 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.790719 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.893144 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.893191 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.893208 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.893229 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.893246 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.996113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.996170 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.996192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.996219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:24 crc kubenswrapper[4771]: I0227 01:06:24.996241 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:24Z","lastTransitionTime":"2026-02-27T01:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.099317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.099357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.099365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.099381 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.099391 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.202495 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.202602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.202643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.202679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.202702 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.306678 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.306750 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.306768 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.306793 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.306810 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.410171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.410255 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.410275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.410302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.410321 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.513292 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.513367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.513405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.513434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.513454 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.616451 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.616518 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.616542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.616611 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.616646 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.720099 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.720162 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.720180 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.720206 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.720229 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.789068 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.789472 4771 scope.go:117] "RemoveContainer" containerID="6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.822235 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.822333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.822353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.822414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.822436 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.924931 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.924966 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.924978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.924994 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:25 crc kubenswrapper[4771]: I0227 01:06:25.925006 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:25Z","lastTransitionTime":"2026-02-27T01:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.028472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.028527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.028575 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.028603 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.028621 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.132080 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.132142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.132159 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.132183 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.132203 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.180597 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.183654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.184168 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.200428 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.213436 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.235094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.235145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.235162 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.235185 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.235203 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.243582 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.259657 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.278874 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.293669 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.303293 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.313685 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.337794 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.337836 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.337847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.337866 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.337879 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.441371 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.441434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.441467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.441492 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.441509 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.548991 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.549041 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.549058 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.549080 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.549096 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.652624 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.652678 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.652696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.652718 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.652905 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.755648 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.755784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.755809 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.755847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.755863 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.772105 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.772176 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:26 crc kubenswrapper[4771]: E0227 01:06:26.772282 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.772293 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:26 crc kubenswrapper[4771]: E0227 01:06:26.772426 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:26 crc kubenswrapper[4771]: E0227 01:06:26.772537 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.858962 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.859017 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.859035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.859057 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.859076 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.962060 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.962122 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.962139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.962161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:26 crc kubenswrapper[4771]: I0227 01:06:26.962177 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:26Z","lastTransitionTime":"2026-02-27T01:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.064968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.065021 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.065038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.065059 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.065076 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.178062 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.178125 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.178146 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.178175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.178197 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.280733 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.280779 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.280791 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.280810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.280821 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.383187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.383233 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.383244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.383260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.383275 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.486214 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.486289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.486344 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.486375 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.486399 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.589304 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.589360 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.589380 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.589406 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.589422 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.691962 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.692025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.692042 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.692065 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.692085 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.796240 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.797280 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.797336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.797360 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.797389 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.797409 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.810101 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.837856 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.857186 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.880574 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.895327 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.904995 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.905052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.905070 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.905094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.905111 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:27Z","lastTransitionTime":"2026-02-27T01:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.907538 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:27 crc kubenswrapper[4771]: I0227 01:06:27.918659 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.007258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.007292 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.007302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.007319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.007330 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.110647 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.110698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.110716 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.110740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.110758 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.192649 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.208696 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.213943 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.214000 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.214016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.214040 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.214058 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.218295 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gv8pz"] Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.218748 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gv8pz" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.221802 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.221921 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.222780 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.224594 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.254425 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.269371 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.286180 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.294836 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6b4fce63-a548-406d-8663-45d1e335b000-hosts-file\") pod \"node-resolver-gv8pz\" (UID: \"6b4fce63-a548-406d-8663-45d1e335b000\") " pod="openshift-dns/node-resolver-gv8pz" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.295123 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrvxz\" (UniqueName: \"kubernetes.io/projected/6b4fce63-a548-406d-8663-45d1e335b000-kube-api-access-vrvxz\") pod \"node-resolver-gv8pz\" (UID: \"6b4fce63-a548-406d-8663-45d1e335b000\") " pod="openshift-dns/node-resolver-gv8pz" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.300592 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.315531 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.317493 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.317543 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.317588 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.317613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.317633 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.326240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.326291 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.326309 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.326330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.326346 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.330263 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.340674 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.344836 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.345599 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.345639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.345651 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.345670 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.345681 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.359289 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.360029 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.364059 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.364084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.364096 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.364114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.364127 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.368172 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.376203 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.379453 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.379997 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.380022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.380030 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.380041 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.380049 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.389902 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.390682 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.394201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.394235 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.394246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.394262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.394273 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.396083 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6b4fce63-a548-406d-8663-45d1e335b000-hosts-file\") pod \"node-resolver-gv8pz\" (UID: \"6b4fce63-a548-406d-8663-45d1e335b000\") " pod="openshift-dns/node-resolver-gv8pz" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.396134 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrvxz\" (UniqueName: \"kubernetes.io/projected/6b4fce63-a548-406d-8663-45d1e335b000-kube-api-access-vrvxz\") pod \"node-resolver-gv8pz\" (UID: \"6b4fce63-a548-406d-8663-45d1e335b000\") " pod="openshift-dns/node-resolver-gv8pz" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.396189 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6b4fce63-a548-406d-8663-45d1e335b000-hosts-file\") pod \"node-resolver-gv8pz\" (UID: \"6b4fce63-a548-406d-8663-45d1e335b000\") " pod="openshift-dns/node-resolver-gv8pz" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.403428 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.408065 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"3
75bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.408386 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.415540 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrvxz\" (UniqueName: \"kubernetes.io/projected/6b4fce63-a548-406d-8663-45d1e335b000-kube-api-access-vrvxz\") pod \"node-resolver-gv8pz\" (UID: \"6b4fce63-a548-406d-8663-45d1e335b000\") " pod="openshift-dns/node-resolver-gv8pz" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.421608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.421645 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.421657 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.421672 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.421683 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.426972 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.439179 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.450057 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.524734 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.524766 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.524777 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.524791 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.524800 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.542047 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gv8pz" Feb 27 01:06:28 crc kubenswrapper[4771]: W0227 01:06:28.559255 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4fce63_a548_406d_8663_45d1e335b000.slice/crio-3243fbbe75c1c71d71efcc15875c6503fa696f27a43fe080098dee16df4f8c23 WatchSource:0}: Error finding container 3243fbbe75c1c71d71efcc15875c6503fa696f27a43fe080098dee16df4f8c23: Status 404 returned error can't find the container with id 3243fbbe75c1c71d71efcc15875c6503fa696f27a43fe080098dee16df4f8c23 Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.581392 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-srbwq"] Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.582015 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.597430 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.598081 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.598207 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.599677 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.599826 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-k8s-cni-cncf-io\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.599903 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c460c23-4b4a-458f-a52e-4208b9942829-multus-daemon-config\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.599985 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-etc-kubernetes\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600032 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-cni-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600064 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c460c23-4b4a-458f-a52e-4208b9942829-cni-binary-copy\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " 
pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600093 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-socket-dir-parent\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600126 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-cni-bin\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-kubelet\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600186 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-cnibin\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600226 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-system-cni-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600249 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-os-release\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600273 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-multus-certs\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600306 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-netns\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-conf-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc 
kubenswrapper[4771]: I0227 01:06:28.600381 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-hostroot\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600412 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f826\" (UniqueName: \"kubernetes.io/projected/3c460c23-4b4a-458f-a52e-4208b9942829-kube-api-access-6f826\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.600439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-cni-multus\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.601245 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.602875 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-hhdz6"] Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.605517 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hw7dn"] Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.605894 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.605913 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.607604 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.608526 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.609060 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.609247 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.609258 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.609898 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.610155 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.612479 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.621433 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.628181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.628246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.628260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.628277 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.628308 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.628836 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.644921 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.659627 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.671614 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.689775 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.698855 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f826\" (UniqueName: \"kubernetes.io/projected/3c460c23-4b4a-458f-a52e-4208b9942829-kube-api-access-6f826\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701471 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fcw7\" (UniqueName: \"kubernetes.io/projected/0fc94570-c9c6-41e3-8a2b-0536f371b5da-kube-api-access-8fcw7\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701496 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-cnibin\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c460c23-4b4a-458f-a52e-4208b9942829-cni-binary-copy\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701531 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-kubelet\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701660 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-system-cni-dir\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701685 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701714 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca81e505-d53f-496e-bd26-7cec669591e4-rootfs\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701740 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-system-cni-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701768 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-os-release\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701786 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-conf-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701805 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-cni-multus\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701799 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-cnibin\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701871 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-hostroot\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701823 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-hostroot\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701955 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-kubelet\") pod 
\"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.701957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca81e505-d53f-496e-bd26-7cec669591e4-proxy-tls\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702015 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-k8s-cni-cncf-io\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702046 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c460c23-4b4a-458f-a52e-4208b9942829-multus-daemon-config\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702159 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-etc-kubernetes\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702192 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cnibin\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702214 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-os-release\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702282 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-k8s-cni-cncf-io\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702292 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-system-cni-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " 
pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca81e505-d53f-496e-bd26-7cec669591e4-mcd-auth-proxy-config\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702345 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2xw9\" (UniqueName: \"kubernetes.io/projected/ca81e505-d53f-496e-bd26-7cec669591e4-kube-api-access-r2xw9\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702387 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-cni-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702421 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-socket-dir-parent\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702450 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-cni-bin\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-multus-certs\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702520 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702591 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-netns\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-socket-dir-parent\") pod \"multus-srbwq\" (UID: 
\"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702627 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-os-release\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702696 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-etc-kubernetes\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-cni-multus\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702786 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-conf-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702802 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-var-lib-cni-bin\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702839 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-multus-certs\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702839 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-host-run-netns\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.703233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c460c23-4b4a-458f-a52e-4208b9942829-cni-binary-copy\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.703503 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c460c23-4b4a-458f-a52e-4208b9942829-multus-daemon-config\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.702948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c460c23-4b4a-458f-a52e-4208b9942829-multus-cni-dir\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.711894 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.722198 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f826\" (UniqueName: \"kubernetes.io/projected/3c460c23-4b4a-458f-a52e-4208b9942829-kube-api-access-6f826\") pod \"multus-srbwq\" (UID: \"3c460c23-4b4a-458f-a52e-4208b9942829\") " pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.731393 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:
04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.733155 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.733199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.733212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.733230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.733241 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.743208 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.755839 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.764525 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.773108 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.773188 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.773272 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.773209 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.773418 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:28 crc kubenswrapper[4771]: E0227 01:06:28.774839 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.779916 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.790543 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805590 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca81e505-d53f-496e-bd26-7cec669591e4-proxy-tls\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805647 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805541 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805695 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca81e505-d53f-496e-bd26-7cec669591e4-mcd-auth-proxy-config\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805717 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2xw9\" (UniqueName: \"kubernetes.io/projected/ca81e505-d53f-496e-bd26-7cec669591e4-kube-api-access-r2xw9\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805738 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cnibin\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 
01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-os-release\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805795 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805817 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fcw7\" (UniqueName: \"kubernetes.io/projected/0fc94570-c9c6-41e3-8a2b-0536f371b5da-kube-api-access-8fcw7\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805861 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-system-cni-dir\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805880 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca81e505-d53f-496e-bd26-7cec669591e4-rootfs\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805904 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-os-release\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805916 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cnibin\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ca81e505-d53f-496e-bd26-7cec669591e4-rootfs\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") 
" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.805998 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-system-cni-dir\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.806291 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fc94570-c9c6-41e3-8a2b-0536f371b5da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.807068 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca81e505-d53f-496e-bd26-7cec669591e4-mcd-auth-proxy-config\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.807180 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.808104 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fc94570-c9c6-41e3-8a2b-0536f371b5da-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.819393 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.825147 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fcw7\" (UniqueName: \"kubernetes.io/projected/0fc94570-c9c6-41e3-8a2b-0536f371b5da-kube-api-access-8fcw7\") pod \"multus-additional-cni-plugins-hhdz6\" (UID: \"0fc94570-c9c6-41e3-8a2b-0536f371b5da\") " pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.826079 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca81e505-d53f-496e-bd26-7cec669591e4-proxy-tls\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.831892 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.832081 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2xw9\" (UniqueName: \"kubernetes.io/projected/ca81e505-d53f-496e-bd26-7cec669591e4-kube-api-access-r2xw9\") pod \"machine-config-daemon-hw7dn\" (UID: \"ca81e505-d53f-496e-bd26-7cec669591e4\") " pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.842863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.842903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.842914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.842930 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.842941 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.854565 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.873612 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.895083 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.906186 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.935009 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-srbwq" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.945716 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.946146 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.946175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.946190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.946212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.946228 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:28Z","lastTransitionTime":"2026-02-27T01:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.952228 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:06:28 crc kubenswrapper[4771]: W0227 01:06:28.956890 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c460c23_4b4a_458f_a52e_4208b9942829.slice/crio-cd31e415939f1f302a3d6ca893a6eeaa345ab96ea2eb48e952c423fdb4589624 WatchSource:0}: Error finding container cd31e415939f1f302a3d6ca893a6eeaa345ab96ea2eb48e952c423fdb4589624: Status 404 returned error can't find the container with id cd31e415939f1f302a3d6ca893a6eeaa345ab96ea2eb48e952c423fdb4589624 Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.977358 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-h5vs8"] Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.979726 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.982920 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.983230 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.983487 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.983740 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.983811 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.985460 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 01:06:28 crc kubenswrapper[4771]: I0227 01:06:28.985462 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 27 01:06:28 crc kubenswrapper[4771]: W0227 01:06:28.993630 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca81e505_d53f_496e_bd26_7cec669591e4.slice/crio-f3ffcc7ee50d992faf8579ad01c22fadad60ee37fca67662cdf4f957ac7e981e WatchSource:0}: Error finding container f3ffcc7ee50d992faf8579ad01c22fadad60ee37fca67662cdf4f957ac7e981e: Status 404 returned error can't find the container with id f3ffcc7ee50d992faf8579ad01c22fadad60ee37fca67662cdf4f957ac7e981e Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007654 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovn-node-metrics-cert\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-script-lib\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007717 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-node-log\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007738 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c6hm\" (UniqueName: \"kubernetes.io/projected/21f824c6-1bde-4e58-b4ef-72a56a140abb-kube-api-access-9c6hm\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc 
kubenswrapper[4771]: I0227 01:06:29.007771 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-kubelet\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007794 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-netd\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007814 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-ovn\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007832 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-env-overrides\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007947 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-slash\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.007970 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-netns\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008051 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-log-socket\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-var-lib-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008345 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008375 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-ovn-kubernetes\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008395 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-bin\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 
01:06:29.008430 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-systemd\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008450 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008473 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-config\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008503 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-systemd-units\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.008526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-etc-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.021212 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.033158 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.043448 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.052850 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.052896 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.052908 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.052926 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.052977 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.055239 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.071418 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.084710 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.093780 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109396 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-ovn-kubernetes\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109511 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-bin\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-systemd\") pod 
\"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109709 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109728 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-config\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109792 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-systemd-units\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109821 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-etc-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.110013 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-ovn-kubernetes\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.110101 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-bin\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.110169 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.110243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-systemd\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.110273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.109924 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-etc-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.110987 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-systemd-units\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.110911 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovn-node-metrics-cert\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112071 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-script-lib\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112167 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-node-log\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112195 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c6hm\" (UniqueName: \"kubernetes.io/projected/21f824c6-1bde-4e58-b4ef-72a56a140abb-kube-api-access-9c6hm\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112345 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-node-log\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-kubelet\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112459 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-netd\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112490 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-ovn\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112515 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-env-overrides\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112526 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-kubelet\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112542 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-slash\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112587 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-netns\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-var-lib-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112681 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-log-socket\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-log-socket\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112761 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-netd\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112775 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-ovn\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112807 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-var-lib-openvswitch\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.112816 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-netns\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.113262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-script-lib\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.113396 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-slash\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.114831 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovn-node-metrics-cert\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.114707 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"e
nv-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\
\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.115575 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-config\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 
01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.116055 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-env-overrides\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.134790 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c6hm\" (UniqueName: \"kubernetes.io/projected/21f824c6-1bde-4e58-b4ef-72a56a140abb-kube-api-access-9c6hm\") pod \"ovnkube-node-h5vs8\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.135736 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.147599 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.156166 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.156202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.156211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.156230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.156244 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.159672 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.175815 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 
27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.195921 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.196090 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.197898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.198007 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"f3ffcc7ee50d992faf8579ad01c22fadad60ee37fca67662cdf4f957ac7e981e"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.200905 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" event={"ID":"0fc94570-c9c6-41e3-8a2b-0536f371b5da","Type":"ContainerStarted","Data":"e49e9784d33da53a5b9bbb28e6da00a6c32eed2d75fc7a35cfefa46e3c77421f"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.207209 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-srbwq" event={"ID":"3c460c23-4b4a-458f-a52e-4208b9942829","Type":"ContainerStarted","Data":"b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.207263 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-srbwq" event={"ID":"3c460c23-4b4a-458f-a52e-4208b9942829","Type":"ContainerStarted","Data":"cd31e415939f1f302a3d6ca893a6eeaa345ab96ea2eb48e952c423fdb4589624"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.209663 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gv8pz" event={"ID":"6b4fce63-a548-406d-8663-45d1e335b000","Type":"ContainerStarted","Data":"45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.209699 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gv8pz" event={"ID":"6b4fce63-a548-406d-8663-45d1e335b000","Type":"ContainerStarted","Data":"3243fbbe75c1c71d71efcc15875c6503fa696f27a43fe080098dee16df4f8c23"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.217900 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.233234 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.243978 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.253113 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.258303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.258412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.258485 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.258563 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.258620 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.266002 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.280890 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.292026 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.293206 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:29 crc kubenswrapper[4771]: W0227 01:06:29.305806 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21f824c6_1bde_4e58_b4ef_72a56a140abb.slice/crio-5457b050cedf3d58c016746fe9ae19016484dee6506d5903e4a66dadde91100c WatchSource:0}: Error finding container 5457b050cedf3d58c016746fe9ae19016484dee6506d5903e4a66dadde91100c: Status 404 returned error can't find the container with id 5457b050cedf3d58c016746fe9ae19016484dee6506d5903e4a66dadde91100c Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.306281 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.317939 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.331703 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.347180 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.361647 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.361707 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.361721 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.361737 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.361750 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.364274 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.386199 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.395853 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.421070 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.461491 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.464126 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.464151 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.464159 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.464171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.464182 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.509431 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.545577 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.566903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.566942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.566953 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.566965 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.566976 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.582076 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.629188 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa9308
9f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.669102 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.669146 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.669158 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.669174 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.669184 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.674243 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.715533 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.747092 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.772510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.772590 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.772608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.772635 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.772656 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.792660 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.844276 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.866766 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.875651 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.875679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.875689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.875705 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.875717 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.978367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.978402 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.978412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.978427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:29 crc kubenswrapper[4771]: I0227 01:06:29.978438 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:29Z","lastTransitionTime":"2026-02-27T01:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.080230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.080284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.080297 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.080317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.080329 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.184311 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.184411 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.184432 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.184458 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.184476 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.216609 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.218390 4771 generic.go:334] "Generic (PLEG): container finished" podID="0fc94570-c9c6-41e3-8a2b-0536f371b5da" containerID="45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca" exitCode=0 Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.218448 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" event={"ID":"0fc94570-c9c6-41e3-8a2b-0536f371b5da","Type":"ContainerDied","Data":"45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.220406 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94" exitCode=0 Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.220450 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.220514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"5457b050cedf3d58c016746fe9ae19016484dee6506d5903e4a66dadde91100c"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.256108 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.282478 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.289309 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.289376 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.289400 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.289431 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.289457 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.301007 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.319330 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.340294 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.360694 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-2
7T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e
67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.377296 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.393116 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.394919 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.394946 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.394956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.394972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.395007 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.406237 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.418329 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.432714 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.445847 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.459723 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.476189 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.493707 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.498048 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.498095 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.498138 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.498164 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.498182 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.513155 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.548616 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.586764 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.603221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.603451 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.603459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.603472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.603481 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.621821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.672527 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"
tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.706361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.706413 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.706431 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.706456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.706473 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.709405 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z 
is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.753326 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.772504 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.772532 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:30 crc kubenswrapper[4771]: E0227 01:06:30.772643 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.772660 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:30 crc kubenswrapper[4771]: E0227 01:06:30.772795 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:30 crc kubenswrapper[4771]: E0227 01:06:30.772918 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.785769 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.809886 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.809920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.809933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.809947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.809959 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.830257 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.873478 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.911800 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.911828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.911836 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.911848 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.911856 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:30Z","lastTransitionTime":"2026-02-27T01:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:30 crc kubenswrapper[4771]: I0227 01:06:30.921001 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.015071 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.015315 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.015326 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.015343 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.015356 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.117188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.117223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.117232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.117245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.117256 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.219825 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.219864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.219872 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.219886 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.219896 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.225266 4771 generic.go:334] "Generic (PLEG): container finished" podID="0fc94570-c9c6-41e3-8a2b-0536f371b5da" containerID="3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d" exitCode=0 Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.225319 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" event={"ID":"0fc94570-c9c6-41e3-8a2b-0536f371b5da","Type":"ContainerDied","Data":"3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.229649 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.229954 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.230086 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.230210 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.230332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.239258 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.250668 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.272163 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.290207 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z 
is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.304727 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.322696 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.322950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.322983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.323019 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.323037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.323072 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.335482 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.348585 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.366908 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.382241 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.395229 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.414235 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.425289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.425318 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.425329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.425345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.425356 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.431709 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.527388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 
01:06:31.527775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.527799 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.527828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.527850 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.630724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.630793 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.630814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.630844 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.630864 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.733853 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.733925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.733948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.733980 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.734001 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.836488 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.836581 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.836608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.836640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.836663 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.939820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.939905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.939928 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.939961 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:31 crc kubenswrapper[4771]: I0227 01:06:31.939983 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:31Z","lastTransitionTime":"2026-02-27T01:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.043216 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.043276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.043292 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.043314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.043332 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.146673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.146730 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.146747 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.146770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.146788 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.237674 4771 generic.go:334] "Generic (PLEG): container finished" podID="0fc94570-c9c6-41e3-8a2b-0536f371b5da" containerID="0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107" exitCode=0 Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.237801 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" event={"ID":"0fc94570-c9c6-41e3-8a2b-0536f371b5da","Type":"ContainerDied","Data":"0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.243937 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.249523 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.249608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.249629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.249653 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.249672 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.264630 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.287214 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.306879 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.328105 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.349338 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.353317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.353382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.353407 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.353438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.353463 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.372071 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.403436 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27
T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.433720 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z 
is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.456302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.456360 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.456378 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.456402 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.456423 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.458641 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.458810 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.458843 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.458916 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:06:48.458885597 +0000 UTC m=+121.396446925 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.458946 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.459010 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:48.458989979 +0000 UTC m=+121.396551367 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.458950 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.459125 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:48.459104452 +0000 UTC m=+121.396665840 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.466377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.487115 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.507267 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.520690 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.539531 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:32Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.559465 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.559531 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.559744 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.559768 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.559768 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.559829 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.559852 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.559921 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:48.559895312 +0000 UTC m=+121.497456630 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.559785 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.560101 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:48.560054096 +0000 UTC m=+121.497615454 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.560326 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.560364 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.560379 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.560398 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.560411 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.662048 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.662074 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.662082 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.662094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.662103 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.764839 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.764883 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.764898 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.764921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.764937 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.772962 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.773104 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.773519 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.773649 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.773726 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:32 crc kubenswrapper[4771]: E0227 01:06:32.773802 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.868425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.868495 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.868514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.868538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.868584 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.971657 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.971727 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.971745 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.971769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:32 crc kubenswrapper[4771]: I0227 01:06:32.971787 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:32Z","lastTransitionTime":"2026-02-27T01:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.074002 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.074082 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.074106 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.074136 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.074160 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.178367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.178427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.178447 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.178474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.178496 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.250792 4771 generic.go:334] "Generic (PLEG): container finished" podID="0fc94570-c9c6-41e3-8a2b-0536f371b5da" containerID="c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9" exitCode=0 Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.250872 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" event={"ID":"0fc94570-c9c6-41e3-8a2b-0536f371b5da","Type":"ContainerDied","Data":"c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.286483 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.286584 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.286610 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.286644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.286666 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.287412 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.309220 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.327075 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.348001 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.366911 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.382799 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.389525 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.389631 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.389656 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.389686 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.389708 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.412197 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.444821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z 
is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.471952 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.492507 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.494090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.494149 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.494168 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.494193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.494212 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.511637 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.530911 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.551018 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:33Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.596839 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.596888 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.596909 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.596934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.596951 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.699938 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.699984 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.700000 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.700022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.700038 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.802211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.802283 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.802311 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.802335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.802352 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.905870 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.905934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.905950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.905973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:33 crc kubenswrapper[4771]: I0227 01:06:33.905991 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:33Z","lastTransitionTime":"2026-02-27T01:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.009186 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.009243 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.009262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.009285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.009302 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.112079 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.112139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.112160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.112185 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.112203 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.214405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.214443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.214453 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.214470 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.214483 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.260227 4771 generic.go:334] "Generic (PLEG): container finished" podID="0fc94570-c9c6-41e3-8a2b-0536f371b5da" containerID="ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7" exitCode=0 Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.260311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" event={"ID":"0fc94570-c9c6-41e3-8a2b-0536f371b5da","Type":"ContainerDied","Data":"ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.262511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.269758 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.285142 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.305802 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.317535 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.317595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.317613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.317635 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.317656 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.321716 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.348496 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.385146 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.407584 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441e
cd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.421421 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.421528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.421574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.421599 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.421615 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.433332 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.469532 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.502982 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.515489 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.523949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.523994 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.524008 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.524027 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.524040 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.527536 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.538624 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.548161 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.557488 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.565596 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.576739 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.595491 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.611415 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.626676 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.626705 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.626716 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.626731 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.626740 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.628088 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.639629 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.651909 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.674237 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.690947 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.701788 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.717032 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.729759 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.729820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.729837 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.729862 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.729883 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.733248 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:34Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.772696 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:34 crc kubenswrapper[4771]: E0227 01:06:34.772864 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.773424 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.773455 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:34 crc kubenswrapper[4771]: E0227 01:06:34.773633 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:34 crc kubenswrapper[4771]: E0227 01:06:34.773756 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.832600 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.832659 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.832681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.832709 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.832733 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.935283 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.935341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.935361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.935386 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:34 crc kubenswrapper[4771]: I0227 01:06:34.935406 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:34Z","lastTransitionTime":"2026-02-27T01:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.038588 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.038649 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.038668 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.038690 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.038707 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.102382 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-cbt48"] Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.102937 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.105590 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.105942 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.106185 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.106355 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.132013 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\
\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.141404 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.141468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.141486 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.141512 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.141530 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.171237 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.190587 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-serviceca\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.190806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-host\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.190871 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdcw\" (UniqueName: \"kubernetes.io/projected/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-kube-api-access-dvdcw\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.194690 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.220677 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.241512 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.243723 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.243813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.243832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.243857 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.243874 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.261524 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.277972 4771 generic.go:334] "Generic (PLEG): container finished" podID="0fc94570-c9c6-41e3-8a2b-0536f371b5da" containerID="bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d" exitCode=0 Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.278041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" event={"ID":"0fc94570-c9c6-41e3-8a2b-0536f371b5da","Type":"ContainerDied","Data":"bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.289304 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.291936 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-host\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.292018 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dvdcw\" (UniqueName: \"kubernetes.io/projected/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-kube-api-access-dvdcw\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.292075 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-serviceca\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.292142 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-host\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.293852 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-serviceca\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.313636 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e
626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.323381 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvdcw\" (UniqueName: \"kubernetes.io/projected/5202229e-c4e0-4bcd-8295-85e4e9f4f4ac-kube-api-access-dvdcw\") pod \"node-ca-cbt48\" (UID: \"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\") " pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.333930 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.346541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.346600 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.346612 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.346627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.346639 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.361953 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba9
7e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.372908 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.387105 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.401420 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.410829 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.420848 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.427669 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cbt48" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.432717 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.442880 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: W0227 01:06:35.447409 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5202229e_c4e0_4bcd_8295_85e4e9f4f4ac.slice/crio-099dfb6b3ea8ccf9c6a720db335cae665f04e225cc57388c02b3459804c719d5 WatchSource:0}: Error finding container 099dfb6b3ea8ccf9c6a720db335cae665f04e225cc57388c02b3459804c719d5: Status 404 returned error can't find the container with id 
099dfb6b3ea8ccf9c6a720db335cae665f04e225cc57388c02b3459804c719d5 Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.448749 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.448775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.448784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.448798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.448807 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.456449 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.474828 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.486278 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.506335 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.519244 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.532620 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.541907 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.551500 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.551530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.551538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.551568 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.551579 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.553148 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.564444 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.576050 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.587605 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:35Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.657902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.658214 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.658229 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.658250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.658270 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.760388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.760436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.760453 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.760478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.760495 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.788680 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.863804 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.863855 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.863872 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.863894 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.863910 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.967097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.967158 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.967178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.967202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:35 crc kubenswrapper[4771]: I0227 01:06:35.967218 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:35Z","lastTransitionTime":"2026-02-27T01:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.070629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.070704 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.070722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.070749 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.070768 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.175181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.175241 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.175257 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.175282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.175300 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.278541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.278629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.278648 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.278669 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.278683 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.288504 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" event={"ID":"0fc94570-c9c6-41e3-8a2b-0536f371b5da","Type":"ContainerStarted","Data":"cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.290996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cbt48" event={"ID":"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac","Type":"ContainerStarted","Data":"7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.291077 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cbt48" event={"ID":"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac","Type":"ContainerStarted","Data":"099dfb6b3ea8ccf9c6a720db335cae665f04e225cc57388c02b3459804c719d5"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.298250 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.298745 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.298810 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.298834 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.309148 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.334236 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.361359 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.381665 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.382910 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.383038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.383073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.383086 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.383102 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc 
kubenswrapper[4771]: I0227 01:06:36.383114 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.387985 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\
\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.421092 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.437601 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.459060 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.477001 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.485505 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.485562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.485575 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.485590 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.485602 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.491880 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.505536 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.524140 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.558073 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.577308 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.588683 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.588738 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.588757 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.588782 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.588799 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.591689 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.614126 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.633929 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.654477 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.672198 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.687017 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.691177 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.691310 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.691373 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.691404 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.691426 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.728440 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.748005 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.762901 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.772721 4771 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.772829 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:36 crc kubenswrapper[4771]: E0227 01:06:36.773008 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:36 crc kubenswrapper[4771]: E0227 01:06:36.772851 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.773153 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:36 crc kubenswrapper[4771]: E0227 01:06:36.773281 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.780796 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.794270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.794363 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.794417 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.794444 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.794458 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.797929 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.812076 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.828283 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.850816 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.878150 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.895390 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.897018 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.897089 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.897119 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.897152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.897178 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:36Z","lastTransitionTime":"2026-02-27T01:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:36 crc kubenswrapper[4771]: I0227 01:06:36.915625 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:36Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.000378 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.000428 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.000446 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.000470 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.000488 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.103521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.103821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.103902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.103990 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.104084 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.207589 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.207656 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.207679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.207712 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.207735 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.310706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.310771 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.310794 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.310821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.310843 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.413818 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.413877 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.413897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.413921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.413937 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.516343 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.516399 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.516416 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.516441 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.516458 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.619153 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.619209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.619228 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.619250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.619267 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.721995 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.722075 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.722097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.722145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.722167 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.801781 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.817481 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.827067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.827137 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.827160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.827193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.827215 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.839216 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.859446 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.878601 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.896165 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.919053 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.934904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.934957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:37 crc 
kubenswrapper[4771]: I0227 01:06:37.934973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.934995 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.935015 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:37Z","lastTransitionTime":"2026-02-27T01:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.950751 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.972706 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:37 crc kubenswrapper[4771]: I0227 01:06:37.994562 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.011823 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.029209 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.037308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.037341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.037351 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.037367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.037378 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.049895 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.072637 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.092463 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.139956 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.140015 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.140032 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.140054 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.140073 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.243202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.243284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.243312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.243347 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.243383 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.346733 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.346784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.346796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.346814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.346826 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.450391 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.450573 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.450608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.450637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.450662 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.452928 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.452991 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.453016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.453041 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.453062 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.472728 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.476814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.476869 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.476889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.476914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.476932 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.497600 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.504081 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.504141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.504160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.504184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.504202 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.528286 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.533401 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.533486 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.533509 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.533576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.533602 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.553989 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.559272 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.559322 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.559340 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.559362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.559381 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.581920 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.582105 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.586667 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.586699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.586710 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.586727 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.586739 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.690069 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.690122 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.690137 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.690156 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.690210 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.772745 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.772821 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.772770 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.772929 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.773028 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:38 crc kubenswrapper[4771]: E0227 01:06:38.773143 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.795248 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.795291 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.795307 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.795328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.795344 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.898757 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.898821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.898838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.898865 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:38 crc kubenswrapper[4771]: I0227 01:06:38.898883 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:38Z","lastTransitionTime":"2026-02-27T01:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.001530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.001621 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.001643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.001667 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.001688 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.105325 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.105740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.105761 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.105785 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.105803 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.208804 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.208857 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.208874 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.208897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.208915 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.311067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.311113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.311131 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.311152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.311169 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.312747 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/0.log" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.316212 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df" exitCode=1 Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.316262 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.317370 4771 scope.go:117] "RemoveContainer" containerID="317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.337354 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.356694 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.374342 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.395513 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The 
leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.413858 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.413936 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.413955 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.414336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.414389 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.417249 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.435575 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.450221 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.474618 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.511028 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:39Z\\\",\\\"message\\\":\\\"06:39.080726 6614 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081319 6614 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081449 6614 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081531 6614 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.081679 6614 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.082240 6614 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:06:39.082284 6614 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:06:39.082324 6614 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:39.082349 6614 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 01:06:39.082415 6614 factory.go:656] Stopping watch factory\\\\nI0227 01:06:39.082429 6614 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:39.082453 6614 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.517512 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.517577 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.517595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.517618 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.517637 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.528851 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.560109 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.575880 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.596329 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.611941 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.620828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.620890 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.620911 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.620939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.620961 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.626359 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:39Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.724679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.724982 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.725126 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.725304 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.725433 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.827942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.827988 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.828002 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.828019 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.828031 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.930451 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.930751 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.930907 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.931149 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:39 crc kubenswrapper[4771]: I0227 01:06:39.931363 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:39Z","lastTransitionTime":"2026-02-27T01:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.033893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.034233 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.034316 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.034397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.034472 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.137020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.137059 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.137070 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.137086 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.137098 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.239590 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.239634 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.239644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.239661 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.239676 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.322021 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/0.log" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.325898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.326723 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.342171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.342225 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.342244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.342267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.342285 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.345313 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.369173 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.393427 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.410290 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.429722 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.445298 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.445344 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc 
kubenswrapper[4771]: I0227 01:06:40.445363 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.445386 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.445404 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.453262 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071
720a32e1216be0940a5ed393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:39Z\\\",\\\"message\\\":\\\"06:39.080726 6614 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081319 6614 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081449 6614 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081531 6614 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.081679 6614 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.082240 6614 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:06:39.082284 6614 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:06:39.082324 6614 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:39.082349 6614 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 01:06:39.082415 6614 factory.go:656] Stopping watch factory\\\\nI0227 01:06:39.082429 6614 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:39.082453 6614 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.466580 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.497766 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.512422 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.536605 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.549148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.549208 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.549225 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.549251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.549269 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.555469 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.570273 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.584778 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.600006 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.613907 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.652610 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.652664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.652688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.652716 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.652736 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.754911 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.755300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.755519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.755740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.755883 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.772943 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:40 crc kubenswrapper[4771]: E0227 01:06:40.773242 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.773065 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:40 crc kubenswrapper[4771]: E0227 01:06:40.773653 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.772988 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:40 crc kubenswrapper[4771]: E0227 01:06:40.774148 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.858425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.858473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.858490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.858512 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.858528 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.961012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.961219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.961279 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.961341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:40 crc kubenswrapper[4771]: I0227 01:06:40.961402 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:40Z","lastTransitionTime":"2026-02-27T01:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.063900 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.063972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.063996 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.064026 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.064049 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.167945 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.168026 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.168052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.168082 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.168105 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.198535 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9"] Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.199472 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.203817 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.206384 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.215406 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.235592 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.252197 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.271666 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.271760 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.271779 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.271806 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.271825 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.275312 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-env-overrides\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.275376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.275322 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.275456 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.275695 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vft5t\" (UniqueName: \"kubernetes.io/projected/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-kube-api-access-vft5t\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.295710 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.315149 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.329384 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.332826 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/1.log" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.333794 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/0.log" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.337981 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393" exitCode=1 Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.338050 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 
01:06:41.338109 4771 scope.go:117] "RemoveContainer" containerID="317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.339798 4771 scope.go:117] "RemoveContainer" containerID="26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393" Feb 27 01:06:41 crc kubenswrapper[4771]: E0227 01:06:41.340288 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.355061 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"fin
ishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f
567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.375127 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.375173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.375183 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.375198 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.375207 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.376815 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-env-overrides\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.376931 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.377077 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.377157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vft5t\" (UniqueName: \"kubernetes.io/projected/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-kube-api-access-vft5t\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 
01:06:41.377695 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-env-overrides\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.378142 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.420295 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071
720a32e1216be0940a5ed393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:39Z\\\",\\\"message\\\":\\\"06:39.080726 6614 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081319 6614 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081449 6614 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081531 6614 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.081679 6614 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.082240 6614 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:06:39.082284 6614 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:06:39.082324 6614 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:39.082349 6614 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 01:06:39.082415 6614 factory.go:656] Stopping watch factory\\\\nI0227 01:06:39.082429 6614 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:39.082453 6614 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.420521 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.426898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vft5t\" (UniqueName: \"kubernetes.io/projected/7753e0fc-55c7-4f3e-a5ac-026a71aa8a46-kube-api-access-vft5t\") pod \"ovnkube-control-plane-749d76644c-54zs9\" (UID: \"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.437448 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.454111 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.479916 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.479991 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.480016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.480045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.480065 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.491959 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.514904 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.518757 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.537589 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: W0227 01:06:41.540762 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7753e0fc_55c7_4f3e_a5ac_026a71aa8a46.slice/crio-8b38ea156d0990f4ef679e0dc3725f038847515bd7e07090d92e85ada80033b9 WatchSource:0}: Error finding container 8b38ea156d0990f4ef679e0dc3725f038847515bd7e07090d92e85ada80033b9: Status 404 returned error can't find the container with id 8b38ea156d0990f4ef679e0dc3725f038847515bd7e07090d92e85ada80033b9 Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.562677 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.583589 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.583619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.583641 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.583655 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.583663 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.590360 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.610311 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.630884 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.646584 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.665771 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.680112 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.686115 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.686150 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.686160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.686174 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.686185 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.696094 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412
b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.715079 4771 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:39Z\\\",\\\"message\\\":\\\"06:39.080726 6614 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081319 6614 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081449 6614 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081531 6614 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.081679 6614 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.082240 6614 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:06:39.082284 6614 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:06:39.082324 6614 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:39.082349 6614 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 01:06:39.082415 6614 factory.go:656] Stopping watch factory\\\\nI0227 01:06:39.082429 6614 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:39.082453 6614 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.729752 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.752788 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.766057 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.778606 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.791757 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.791816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.791835 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.791859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.791878 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.793510 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.807103 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.827359 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.843681 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.857524 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:41Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.894381 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.894431 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.894443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.894460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.894469 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.977831 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-24pv2"] Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.978487 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:41 crc kubenswrapper[4771]: E0227 01:06:41.978609 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.997660 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.997703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.997717 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.997732 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:41 crc kubenswrapper[4771]: I0227 01:06:41.997746 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:41Z","lastTransitionTime":"2026-02-27T01:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.012536 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://317365b727c789ebe80fdce56fb311e29b6274d979f71c333bf6e782d3f590df\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:39Z\\\",\\\"message\\\":\\\"06:39.080726 6614 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081319 6614 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081449 6614 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:39.081531 6614 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.081679 6614 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:39.082240 6614 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:06:39.082284 6614 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:06:39.082324 6614 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:39.082349 6614 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 01:06:39.082415 6614 factory.go:656] Stopping watch factory\\\\nI0227 01:06:39.082429 6614 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:39.082453 6614 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.031310 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.045844 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.058305 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.070720 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.084520 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln79k\" (UniqueName: \"kubernetes.io/projected/15dd6a85-eabc-4a32-a283-33bf72d2a041-kube-api-access-ln79k\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.084632 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.088141 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.100635 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.100695 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.100714 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.100739 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.100756 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.101129 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.121229 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.131825 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.163954 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.177190 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.186027 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln79k\" (UniqueName: \"kubernetes.io/projected/15dd6a85-eabc-4a32-a283-33bf72d2a041-kube-api-access-ln79k\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.186153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:42 crc kubenswrapper[4771]: E0227 
01:06:42.186292 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:42 crc kubenswrapper[4771]: E0227 01:06:42.186371 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs podName:15dd6a85-eabc-4a32-a283-33bf72d2a041 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:42.68634925 +0000 UTC m=+115.623910558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs") pod "network-metrics-daemon-24pv2" (UID: "15dd6a85-eabc-4a32-a283-33bf72d2a041") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.193239 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.203815 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln79k\" (UniqueName: \"kubernetes.io/projected/15dd6a85-eabc-4a32-a283-33bf72d2a041-kube-api-access-ln79k\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " 
pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.203909 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.203988 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.204012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.204042 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.204080 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.206617 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.219444 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.232232 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.244899 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.255404 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.307032 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.307093 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.307109 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.307133 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.307152 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.346355 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/1.log" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.353750 4771 scope.go:117] "RemoveContainer" containerID="26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393" Feb 27 01:06:42 crc kubenswrapper[4771]: E0227 01:06:42.353989 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.356699 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" event={"ID":"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46","Type":"ContainerStarted","Data":"bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.356757 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" event={"ID":"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46","Type":"ContainerStarted","Data":"1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.356777 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" event={"ID":"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46","Type":"ContainerStarted","Data":"8b38ea156d0990f4ef679e0dc3725f038847515bd7e07090d92e85ada80033b9"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.377092 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.395507 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.410685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.410765 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.410790 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.410824 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.410848 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.412362 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.428019 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.442164 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.460928 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.481286 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.498999 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.513820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.513895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.513921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.513953 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.513973 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.515420 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc 
kubenswrapper[4771]: I0227 01:06:42.538871 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"
cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.574305 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.608217 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c
61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.617198 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.617290 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc 
kubenswrapper[4771]: I0227 01:06:42.617313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.617340 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.617359 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.632947 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.655834 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.677752 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.692352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:42 crc kubenswrapper[4771]: E0227 01:06:42.692618 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:42 crc kubenswrapper[4771]: E0227 01:06:42.692721 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs podName:15dd6a85-eabc-4a32-a283-33bf72d2a041 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:43.692695453 +0000 UTC m=+116.630256781 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs") pod "network-metrics-daemon-24pv2" (UID: "15dd6a85-eabc-4a32-a283-33bf72d2a041") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.697614 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.713661 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.720173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.720237 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.720252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.720267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.720279 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.736523 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.751349 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.766734 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.772941 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.772988 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:42 crc kubenswrapper[4771]: E0227 01:06:42.773055 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:42 crc kubenswrapper[4771]: E0227 01:06:42.773109 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.773107 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f
2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.773668 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:42 crc kubenswrapper[4771]: E0227 01:06:42.773931 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.800804 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071
720a32e1216be0940a5ed393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.811532 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.822469 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.822522 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.822534 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.822569 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.822582 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.829391 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.846192 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.862495 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.879707 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.894854 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.915167 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.926293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.926339 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.926348 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.926362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.926371 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:42Z","lastTransitionTime":"2026-02-27T01:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.932326 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.963818 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:42 crc kubenswrapper[4771]: I0227 01:06:42.986685 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.001820 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:42Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.020943 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.029353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.029421 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.029446 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.029478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.029499 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.042659 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.078417 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.103040 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8
d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.126722 4771 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.134020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.134095 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.134117 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.134148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.134172 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.150041 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.168820 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.184420 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc 
kubenswrapper[4771]: I0227 01:06:43.202047 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.219123 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.233816 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.237746 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.237808 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.237823 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.237849 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.237869 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.253878 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.275059 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.293505 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.310004 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.334364 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.341524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.341637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc 
kubenswrapper[4771]: I0227 01:06:43.341661 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.341687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.341705 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.358442 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071
720a32e1216be0940a5ed393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.373211 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.389082 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:43Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.445238 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.445301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.445319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.445344 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.445362 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.548606 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.548695 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.548714 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.548740 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.548756 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.651887 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.651946 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.651964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.651989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.652009 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.702048 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:43 crc kubenswrapper[4771]: E0227 01:06:43.702309 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:43 crc kubenswrapper[4771]: E0227 01:06:43.702419 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs podName:15dd6a85-eabc-4a32-a283-33bf72d2a041 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:45.702386105 +0000 UTC m=+118.639947473 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs") pod "network-metrics-daemon-24pv2" (UID: "15dd6a85-eabc-4a32-a283-33bf72d2a041") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.754875 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.754947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.754964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.754988 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.755009 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.772965 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:43 crc kubenswrapper[4771]: E0227 01:06:43.773149 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.858231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.858289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.858306 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.858329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.858350 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.960791 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.960863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.960879 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.960902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:43 crc kubenswrapper[4771]: I0227 01:06:43.960916 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:43Z","lastTransitionTime":"2026-02-27T01:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.064378 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.064436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.064459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.064491 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.064517 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.167152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.167216 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.167234 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.167257 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.167274 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.270416 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.270478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.270494 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.270517 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.270540 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.374025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.374090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.374109 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.374133 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.374150 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.477197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.477258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.477279 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.477302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.477320 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.580593 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.580670 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.580693 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.580723 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.580746 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.683444 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.683497 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.683514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.683536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.683586 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.773097 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.773153 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.773122 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 01:06:44 crc kubenswrapper[4771]: E0227 01:06:44.773371 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 01:06:44 crc kubenswrapper[4771]: E0227 01:06:44.773586 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 01:06:44 crc kubenswrapper[4771]: E0227 01:06:44.773738 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.788641 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.788689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.788700 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.788717 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.788728 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.891155 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.891218 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.891236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.891262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.891279 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.995007 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.995068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.995085 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.995111 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:44 crc kubenswrapper[4771]: I0227 01:06:44.995156 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:44Z","lastTransitionTime":"2026-02-27T01:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.098487 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.098581 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.098601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.098627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.098646 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.202310 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.202391 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.202415 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.202445 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.202464 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.305366 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.305440 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.305466 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.305497 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.305593 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.408331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.408441 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.408460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.408483 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.408502 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.511826 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.511881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.511897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.511920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.511962 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.621097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.621173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.621199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.621231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.621262 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.722238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2"
Feb 27 01:06:45 crc kubenswrapper[4771]: E0227 01:06:45.722389 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 27 01:06:45 crc kubenswrapper[4771]: E0227 01:06:45.722450 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs podName:15dd6a85-eabc-4a32-a283-33bf72d2a041 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:49.722433599 +0000 UTC m=+122.659994927 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs") pod "network-metrics-daemon-24pv2" (UID: "15dd6a85-eabc-4a32-a283-33bf72d2a041") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.724184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.724212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.724224 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.724241 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.724254 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.773059 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2"
Feb 27 01:06:45 crc kubenswrapper[4771]: E0227 01:06:45.773259 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.827192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.827275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.827300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.827328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.827350 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
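Every NetworkReady=false failure in this stream traces back to the same root message: no CNI configuration file in /etc/kubernetes/cni/net.d/. A rough Go approximation of that readiness probe, assuming the mechanism is simply a scan of the conf directory for a usable network config (this is not the kubelet's or CRI-O's actual code):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether the CNI conf directory contains at least
// one candidate configuration file (.conf, .conflist, or .json), which
// is roughly what "no CNI configuration file in ..." is complaining about.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	fmt.Println("CNI config present:", ok, "err:", err)
}

Once the network operator writes its config into that directory, this check flips to true and the NodeNotReady churn above stops.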
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.930400 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.930464 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.930482 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.930506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:45 crc kubenswrapper[4771]: I0227 01:06:45.930524 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:45Z","lastTransitionTime":"2026-02-27T01:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.033582 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.033661 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.033683 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.033711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.033732 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.136970 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.137058 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.137113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.137139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.137156 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.239959 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.240022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.240040 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.240065 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.240084 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.342466 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.342527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.342544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.342598 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.342617 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.445676 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.445752 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.445774 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.445803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.445824 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.548720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.548777 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.548794 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.548818 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.548840 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.652699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.652774 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.652792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.652817 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.652834 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.756451 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.756516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.756536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.756591 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.756618 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.772497 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.772637 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.772528 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 01:06:46 crc kubenswrapper[4771]: E0227 01:06:46.772721 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 01:06:46 crc kubenswrapper[4771]: E0227 01:06:46.772874 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 01:06:46 crc kubenswrapper[4771]: E0227 01:06:46.773026 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.859531 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.859592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.859602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.859615 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.859625 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.962355 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.962440 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.962457 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.962482 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:46 crc kubenswrapper[4771]: I0227 01:06:46.962500 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:46Z","lastTransitionTime":"2026-02-27T01:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.066246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.066330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.066353 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.066383 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.066404 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:47Z","lastTransitionTime":"2026-02-27T01:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.169101 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.169163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.169180 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.169206 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.169225 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:47Z","lastTransitionTime":"2026-02-27T01:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.272329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.272392 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.272411 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.272438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.272468 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:47Z","lastTransitionTime":"2026-02-27T01:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.376314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.376397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.376436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.376467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.376488 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:47Z","lastTransitionTime":"2026-02-27T01:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.479970 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.480049 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.480074 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.480104 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.480128 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:47Z","lastTransitionTime":"2026-02-27T01:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.583097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.583147 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.583166 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.583188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.583205 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:47Z","lastTransitionTime":"2026-02-27T01:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.687769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.687829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.687847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.687870 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.687887 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:47Z","lastTransitionTime":"2026-02-27T01:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.772218 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2"
Feb 27 01:06:47 crc kubenswrapper[4771]: E0227 01:06:47.772877 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041"
Feb 27 01:06:47 crc kubenswrapper[4771]: E0227 01:06:47.788489 4771 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.795378 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.816441 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.839158 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.860500 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z"
Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.879079 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:47 crc kubenswrapper[4771]: E0227 01:06:47.890694 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.899161 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.925526 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.957884 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.975015 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:47 crc kubenswrapper[4771]: I0227 01:06:47.987329 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.000593 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.017881 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.036390 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.057847 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.077238 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.091467 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.124830 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.552753 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.552906 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.553029 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.553084 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:07:20.553040243 +0000 UTC m=+153.490601561 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.553219 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.553258 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.553319 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:07:20.553295281 +0000 UTC m=+153.490856609 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.553351 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:07:20.553337762 +0000 UTC m=+153.490899080 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.655460 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.655536 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.655739 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.655775 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.655797 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.655802 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.655834 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.655853 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.655879 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 01:07:20.655851978 +0000 UTC m=+153.593413306 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.655923 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 01:07:20.655899109 +0000 UTC m=+153.593460437 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.681333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.681384 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.681403 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.681427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.681443 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:48Z","lastTransitionTime":"2026-02-27T01:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.701874 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 
2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.707234 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.707331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.707357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.707383 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.707400 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:48Z","lastTransitionTime":"2026-02-27T01:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.727834 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 
2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.733536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.733627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.733644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.733668 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.733685 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:48Z","lastTransitionTime":"2026-02-27T01:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.754120 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 
2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.759358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.759415 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.759434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.759460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.759477 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:48Z","lastTransitionTime":"2026-02-27T01:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.772759 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.772800 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.772772 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.772960 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.773045 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.773191 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.780012 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 
2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.785527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.785632 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.785651 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.785673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:48 crc kubenswrapper[4771]: I0227 01:06:48.785692 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:48Z","lastTransitionTime":"2026-02-27T01:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.805996 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:48Z is after 
2025-08-24T17:21:41Z" Feb 27 01:06:48 crc kubenswrapper[4771]: E0227 01:06:48.806224 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:06:49 crc kubenswrapper[4771]: I0227 01:06:49.770068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:49 crc kubenswrapper[4771]: E0227 01:06:49.770282 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:49 crc kubenswrapper[4771]: E0227 01:06:49.770430 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs podName:15dd6a85-eabc-4a32-a283-33bf72d2a041 nodeName:}" failed. No retries permitted until 2026-02-27 01:06:57.770390821 +0000 UTC m=+130.707952159 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs") pod "network-metrics-daemon-24pv2" (UID: "15dd6a85-eabc-4a32-a283-33bf72d2a041") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:49 crc kubenswrapper[4771]: I0227 01:06:49.772450 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:49 crc kubenswrapper[4771]: E0227 01:06:49.772704 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:06:50 crc kubenswrapper[4771]: I0227 01:06:50.773120 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:50 crc kubenswrapper[4771]: I0227 01:06:50.773154 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:50 crc kubenswrapper[4771]: I0227 01:06:50.773201 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:50 crc kubenswrapper[4771]: E0227 01:06:50.773306 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:50 crc kubenswrapper[4771]: E0227 01:06:50.773481 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:50 crc kubenswrapper[4771]: E0227 01:06:50.773652 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:51 crc kubenswrapper[4771]: I0227 01:06:51.772346 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:51 crc kubenswrapper[4771]: E0227 01:06:51.772531 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:06:52 crc kubenswrapper[4771]: I0227 01:06:52.772608 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:52 crc kubenswrapper[4771]: I0227 01:06:52.772640 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:52 crc kubenswrapper[4771]: I0227 01:06:52.772608 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:52 crc kubenswrapper[4771]: E0227 01:06:52.772777 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:52 crc kubenswrapper[4771]: E0227 01:06:52.772926 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:52 crc kubenswrapper[4771]: E0227 01:06:52.773020 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:52 crc kubenswrapper[4771]: E0227 01:06:52.892503 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:06:53 crc kubenswrapper[4771]: I0227 01:06:53.772938 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:53 crc kubenswrapper[4771]: E0227 01:06:53.773119 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:06:54 crc kubenswrapper[4771]: I0227 01:06:54.772596 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:54 crc kubenswrapper[4771]: I0227 01:06:54.772646 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:54 crc kubenswrapper[4771]: E0227 01:06:54.772808 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:54 crc kubenswrapper[4771]: I0227 01:06:54.772831 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:54 crc kubenswrapper[4771]: E0227 01:06:54.772963 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:54 crc kubenswrapper[4771]: E0227 01:06:54.773126 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:55 crc kubenswrapper[4771]: I0227 01:06:55.773180 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:55 crc kubenswrapper[4771]: E0227 01:06:55.773382 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:06:56 crc kubenswrapper[4771]: I0227 01:06:56.772652 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:56 crc kubenswrapper[4771]: E0227 01:06:56.772781 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:56 crc kubenswrapper[4771]: I0227 01:06:56.772677 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:56 crc kubenswrapper[4771]: I0227 01:06:56.772659 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:56 crc kubenswrapper[4771]: E0227 01:06:56.772867 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:56 crc kubenswrapper[4771]: E0227 01:06:56.772925 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:56 crc kubenswrapper[4771]: I0227 01:06:56.773619 4771 scope.go:117] "RemoveContainer" containerID="26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.427663 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/1.log" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.431070 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e"} Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.431717 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.446443 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.476638 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.498686 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8
d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.519227 4771 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.540785 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.560418 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.575935 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.595288 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.610180 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.640958 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0
779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.654005 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.669085 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 
01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.685847 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.702589 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.717710 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.735509 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.760250 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.772648 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:57 crc kubenswrapper[4771]: E0227 01:06:57.772792 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.793305 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"moun
tPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.806845 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.840978 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.858987 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:57 crc kubenswrapper[4771]: E0227 01:06:57.859148 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:57 crc kubenswrapper[4771]: E0227 01:06:57.859208 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs podName:15dd6a85-eabc-4a32-a283-33bf72d2a041 nodeName:}" failed. No retries permitted until 2026-02-27 01:07:13.859192544 +0000 UTC m=+146.796753842 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs") pod "network-metrics-daemon-24pv2" (UID: "15dd6a85-eabc-4a32-a283-33bf72d2a041") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.870391 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.885881 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: E0227 01:06:57.893036 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.903590 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.921045 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.934443 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.946571 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.961414 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\
\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b38
5648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:57 crc kubenswrapper[4771]: I0227 01:06:57.990017 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0
779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:57Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.003875 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.017212 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 
01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.036290 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.051776 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.069628 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.086244 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.438063 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/2.log" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.439075 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/1.log" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.444362 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e" exitCode=1 Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.444402 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e"} Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 
01:06:58.444477 4771 scope.go:117] "RemoveContainer" containerID="26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.447081 4771 scope.go:117] "RemoveContainer" containerID="e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e" Feb 27 01:06:58 crc kubenswrapper[4771]: E0227 01:06:58.447388 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.491531 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/l
og/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.515950 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.538690 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.559185 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.580721 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.601420 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.621461 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.638054 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.654919 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.670372 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc 
kubenswrapper[4771]: I0227 01:06:58.689809 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.710478 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.729179 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.746086 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.759823 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.772899 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.772954 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.773076 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:06:58 crc kubenswrapper[4771]: E0227 01:06:58.773425 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:06:58 crc kubenswrapper[4771]: E0227 01:06:58.773535 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:06:58 crc kubenswrapper[4771]: E0227 01:06:58.774018 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.780311 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b14472353
88416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.785518 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 27 01:06:58 crc kubenswrapper[4771]: I0227 01:06:58.815717 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0
779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26c597eac9e2878a8cba2c557865510154cca071720a32e1216be0940a5ed393\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:40Z\\\",\\\"message\\\":\\\" 01:06:40.415905 6773 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:06:40.415913 6773 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:06:40.415953 6773 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:40.415965 6773 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 01:06:40.415974 6773 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:06:40.415984 6773 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:06:40.415993 6773 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:06:40.416002 6773 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 01:06:40.416011 6773 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:06:40.416031 6773 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:40.416137 6773 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 01:06:40.416155 6773 factory.go:656] Stopping watch factory\\\\nI0227 01:06:40.416412 6773 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:40.416619 6773 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch 
factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:58Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.016792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.016862 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.016887 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.016914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.016935 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:59Z","lastTransitionTime":"2026-02-27T01:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 01:06:59 crc kubenswrapper[4771]: E0227 01:06:59.038196 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.043910 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.043968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.043987 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.044009 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.044026 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:59Z","lastTransitionTime":"2026-02-27T01:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:59 crc kubenswrapper[4771]: E0227 01:06:59.065120 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.070995 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.071051 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.071068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.071090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.071110 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:59Z","lastTransitionTime":"2026-02-27T01:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:59 crc kubenswrapper[4771]: E0227 01:06:59.091088 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.096218 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.096277 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.096297 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.096320 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.096338 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:59Z","lastTransitionTime":"2026-02-27T01:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:59 crc kubenswrapper[4771]: E0227 01:06:59.116679 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.121314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.121378 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.121396 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.121424 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.121443 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:06:59Z","lastTransitionTime":"2026-02-27T01:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:06:59 crc kubenswrapper[4771]: E0227 01:06:59.143160 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: E0227 01:06:59.143398 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.451676 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/2.log" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.457968 4771 scope.go:117] "RemoveContainer" containerID="e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e" Feb 27 01:06:59 crc kubenswrapper[4771]: E0227 01:06:59.458213 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.477666 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2
a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.500530 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.523500 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.539513 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.553384 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:
06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.573476 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.604360 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 
01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.621513 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.635478 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 
01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.651904 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.670270 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.688136 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.705964 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.724105 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.740635 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc 
kubenswrapper[4771]: I0227 01:06:59.772029 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.773030 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:06:59 crc kubenswrapper[4771]: E0227 01:06:59.773252 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.799476 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:06:59 crc kubenswrapper[4771]: I0227 01:06:59.826696 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:06:59Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:00 crc kubenswrapper[4771]: I0227 01:07:00.773237 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:00 crc kubenswrapper[4771]: E0227 01:07:00.773445 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:00 crc kubenswrapper[4771]: I0227 01:07:00.773250 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:00 crc kubenswrapper[4771]: E0227 01:07:00.773602 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:00 crc kubenswrapper[4771]: I0227 01:07:00.773250 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:00 crc kubenswrapper[4771]: E0227 01:07:00.773694 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:01 crc kubenswrapper[4771]: I0227 01:07:01.772811 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:01 crc kubenswrapper[4771]: E0227 01:07:01.773032 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:02 crc kubenswrapper[4771]: I0227 01:07:02.772176 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:02 crc kubenswrapper[4771]: I0227 01:07:02.772181 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:02 crc kubenswrapper[4771]: E0227 01:07:02.772939 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:02 crc kubenswrapper[4771]: E0227 01:07:02.773041 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:02 crc kubenswrapper[4771]: I0227 01:07:02.772194 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:02 crc kubenswrapper[4771]: E0227 01:07:02.773221 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:02 crc kubenswrapper[4771]: E0227 01:07:02.894592 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:07:03 crc kubenswrapper[4771]: I0227 01:07:03.773448 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:03 crc kubenswrapper[4771]: E0227 01:07:03.773755 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:04 crc kubenswrapper[4771]: I0227 01:07:04.772910 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:04 crc kubenswrapper[4771]: I0227 01:07:04.772986 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:04 crc kubenswrapper[4771]: E0227 01:07:04.773093 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:04 crc kubenswrapper[4771]: E0227 01:07:04.773301 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:04 crc kubenswrapper[4771]: I0227 01:07:04.772936 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:04 crc kubenswrapper[4771]: E0227 01:07:04.773810 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:05 crc kubenswrapper[4771]: I0227 01:07:05.772823 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:05 crc kubenswrapper[4771]: E0227 01:07:05.773325 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:06 crc kubenswrapper[4771]: I0227 01:07:06.773026 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:06 crc kubenswrapper[4771]: I0227 01:07:06.773065 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:06 crc kubenswrapper[4771]: E0227 01:07:06.773221 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:06 crc kubenswrapper[4771]: E0227 01:07:06.773371 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:06 crc kubenswrapper[4771]: I0227 01:07:06.773067 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:06 crc kubenswrapper[4771]: E0227 01:07:06.774315 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.772277 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:07 crc kubenswrapper[4771]: E0227 01:07:07.772516 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.813747 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.836251 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.857181 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.875303 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: E0227 01:07:07.895697 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.896475 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.926494 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.942495 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.953448 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.970490 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:07 crc kubenswrapper[4771]: I0227 01:07:07.991430 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:07Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.012493 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:08Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.028306 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:08Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.061367 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Complete
d\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:08Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.081864 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:08Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.101874 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:08Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.122652 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:08Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.145330 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:08Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.163023 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:08Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.773320 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.773403 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:08 crc kubenswrapper[4771]: I0227 01:07:08.773427 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:08 crc kubenswrapper[4771]: E0227 01:07:08.773544 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:08 crc kubenswrapper[4771]: E0227 01:07:08.773698 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:08 crc kubenswrapper[4771]: E0227 01:07:08.773795 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.508918 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.508980 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.508998 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.509022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.509040 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:09Z","lastTransitionTime":"2026-02-27T01:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:09 crc kubenswrapper[4771]: E0227 01:07:09.529581 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:09Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.532922 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.532989 4771 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.533012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.533038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.533057 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:09Z","lastTransitionTime":"2026-02-27T01:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:09 crc kubenswrapper[4771]: E0227 01:07:09.551719 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:09Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.556163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.556209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.556226 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.556248 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.556266 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:09Z","lastTransitionTime":"2026-02-27T01:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:09 crc kubenswrapper[4771]: E0227 01:07:09.572773 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:09Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.577813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.577881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.577907 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.577933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.577950 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:09Z","lastTransitionTime":"2026-02-27T01:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:09 crc kubenswrapper[4771]: E0227 01:07:09.598177 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:09Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.602990 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.603045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.603067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.603096 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.603119 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:09Z","lastTransitionTime":"2026-02-27T01:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:09 crc kubenswrapper[4771]: E0227 01:07:09.624609 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:09Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:09 crc kubenswrapper[4771]: E0227 01:07:09.624837 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:07:09 crc kubenswrapper[4771]: I0227 01:07:09.772336 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2"
Feb 27 01:07:09 crc kubenswrapper[4771]: E0227 01:07:09.772539 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041"
Feb 27 01:07:10 crc kubenswrapper[4771]: I0227 01:07:10.772488 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 01:07:10 crc kubenswrapper[4771]: I0227 01:07:10.772536 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 01:07:10 crc kubenswrapper[4771]: I0227 01:07:10.772598 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 01:07:10 crc kubenswrapper[4771]: E0227 01:07:10.772732 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 01:07:10 crc kubenswrapper[4771]: E0227 01:07:10.773026 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 01:07:10 crc kubenswrapper[4771]: E0227 01:07:10.773202 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 01:07:11 crc kubenswrapper[4771]: I0227 01:07:11.772883 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2"
Feb 27 01:07:11 crc kubenswrapper[4771]: E0227 01:07:11.773099 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041"
Feb 27 01:07:12 crc kubenswrapper[4771]: I0227 01:07:12.773199 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 01:07:12 crc kubenswrapper[4771]: I0227 01:07:12.773340 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 01:07:12 crc kubenswrapper[4771]: E0227 01:07:12.773411 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 01:07:12 crc kubenswrapper[4771]: E0227 01:07:12.773610 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 01:07:12 crc kubenswrapper[4771]: I0227 01:07:12.774252 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 01:07:12 crc kubenswrapper[4771]: E0227 01:07:12.774474 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 01:07:12 crc kubenswrapper[4771]: E0227 01:07:12.897237 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 01:07:13 crc kubenswrapper[4771]: I0227 01:07:13.772713 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2"
Feb 27 01:07:13 crc kubenswrapper[4771]: E0227 01:07:13.772892 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041"
Feb 27 01:07:13 crc kubenswrapper[4771]: I0227 01:07:13.930622 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2"
Feb 27 01:07:13 crc kubenswrapper[4771]: E0227 01:07:13.930786 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 27 01:07:13 crc kubenswrapper[4771]: E0227 01:07:13.930874 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs podName:15dd6a85-eabc-4a32-a283-33bf72d2a041 nodeName:}" failed. No retries permitted until 2026-02-27 01:07:45.930852039 +0000 UTC m=+178.868413337 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs") pod "network-metrics-daemon-24pv2" (UID: "15dd6a85-eabc-4a32-a283-33bf72d2a041") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 27 01:07:14 crc kubenswrapper[4771]: I0227 01:07:14.772868 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 01:07:14 crc kubenswrapper[4771]: I0227 01:07:14.772903 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 01:07:14 crc kubenswrapper[4771]: I0227 01:07:14.773017 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 01:07:14 crc kubenswrapper[4771]: E0227 01:07:14.773401 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 01:07:14 crc kubenswrapper[4771]: E0227 01:07:14.773521 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 01:07:14 crc kubenswrapper[4771]: E0227 01:07:14.773623 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:14 crc kubenswrapper[4771]: I0227 01:07:14.773719 4771 scope.go:117] "RemoveContainer" containerID="e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e" Feb 27 01:07:14 crc kubenswrapper[4771]: E0227 01:07:14.773944 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:07:15 crc kubenswrapper[4771]: I0227 01:07:15.773043 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:15 crc kubenswrapper[4771]: E0227 01:07:15.773306 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.525535 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/0.log" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.525687 4771 generic.go:334] "Generic (PLEG): container finished" podID="3c460c23-4b4a-458f-a52e-4208b9942829" containerID="b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d" exitCode=1 Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.525737 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-srbwq" event={"ID":"3c460c23-4b4a-458f-a52e-4208b9942829","Type":"ContainerDied","Data":"b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d"} Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.526329 4771 scope.go:117] "RemoveContainer" containerID="b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.549374 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z
\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.570300 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.601069 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a
1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z"
Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.623938 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.646364 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.681991 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.722220 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.747165 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.763402 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.772299 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:16 crc kubenswrapper[4771]: E0227 01:07:16.772456 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.772673 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:16 crc kubenswrapper[4771]: E0227 01:07:16.772754 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.772913 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:16 crc kubenswrapper[4771]: E0227 01:07:16.772987 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.776206 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.790689 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.813724 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.826063 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.837358 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\
":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.850520 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.864007 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.880603 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:16 crc kubenswrapper[4771]: I0227 01:07:16.890479 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:16Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.531933 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/0.log" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.532524 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-srbwq" event={"ID":"3c460c23-4b4a-458f-a52e-4208b9942829","Type":"ContainerStarted","Data":"2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda"} Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.554622 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.575830 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.595636 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.619755 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.653857 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.672580 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.692695 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.714959 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.735306 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.759257 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.772795 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:17 crc kubenswrapper[4771]: E0227 01:07:17.773012 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.784758 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.806101 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.822129 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.858853 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.877388 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8
d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.897137 4771 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: E0227 01:07:17.897801 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.916472 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.933155 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.954581 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f
8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.982722 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0
779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:17 crc kubenswrapper[4771]: I0227 01:07:17.994456 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:17Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.009796 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.026694 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.040780 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.054688 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.067739 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.087482 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.102199 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.133187 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.150322 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8
d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.167631 4771 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.183880 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.198424 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.213652 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.228954 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.245363 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:18Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.773046 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.773091 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:18 crc kubenswrapper[4771]: I0227 01:07:18.773144 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:18 crc kubenswrapper[4771]: E0227 01:07:18.773673 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:18 crc kubenswrapper[4771]: E0227 01:07:18.773874 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:18 crc kubenswrapper[4771]: E0227 01:07:18.774067 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.772379 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:19 crc kubenswrapper[4771]: E0227 01:07:19.772605 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.801639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.801718 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.801736 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.801759 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.801780 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:19Z","lastTransitionTime":"2026-02-27T01:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:19 crc kubenswrapper[4771]: E0227 01:07:19.822573 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.827833 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.827867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.827878 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.827894 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.827907 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:19Z","lastTransitionTime":"2026-02-27T01:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:19 crc kubenswrapper[4771]: E0227 01:07:19.846027 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.851458 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.851519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.851537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.851601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.851622 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:19Z","lastTransitionTime":"2026-02-27T01:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:19 crc kubenswrapper[4771]: E0227 01:07:19.873419 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.879043 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.879081 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.879090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.879105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.879113 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:19Z","lastTransitionTime":"2026-02-27T01:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:19 crc kubenswrapper[4771]: E0227 01:07:19.900438 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.905925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.905972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.905983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.906000 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:19 crc kubenswrapper[4771]: I0227 01:07:19.906011 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:19Z","lastTransitionTime":"2026-02-27T01:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:19 crc kubenswrapper[4771]: E0227 01:07:19.920139 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:19Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:19 crc kubenswrapper[4771]: E0227 01:07:19.920466 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:07:20 crc kubenswrapper[4771]: I0227 01:07:20.605967 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.606206 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:24.60617326 +0000 UTC m=+217.543734558 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:07:20 crc kubenswrapper[4771]: I0227 01:07:20.606332 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:20 crc kubenswrapper[4771]: I0227 01:07:20.606361 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.606500 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.606507 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.606637 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:24.606628572 +0000 UTC m=+217.544189970 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.606715 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-27 01:08:24.606692334 +0000 UTC m=+217.544253642 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 01:07:20 crc kubenswrapper[4771]: I0227 01:07:20.707852 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:20 crc kubenswrapper[4771]: I0227 01:07:20.707967 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.708028 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.708068 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.708087 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.708164 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:24.708142444 +0000 UTC m=+217.645703772 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.708224 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.708255 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.708276 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.708350 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:24.708325939 +0000 UTC m=+217.645887257 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 01:07:20 crc kubenswrapper[4771]: I0227 01:07:20.772202 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:20 crc kubenswrapper[4771]: I0227 01:07:20.772286 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:20 crc kubenswrapper[4771]: I0227 01:07:20.772215 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.772440 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.772541 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:20 crc kubenswrapper[4771]: E0227 01:07:20.772744 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:21 crc kubenswrapper[4771]: I0227 01:07:21.773090 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:21 crc kubenswrapper[4771]: E0227 01:07:21.773328 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:22 crc kubenswrapper[4771]: I0227 01:07:22.772335 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:22 crc kubenswrapper[4771]: I0227 01:07:22.772379 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:22 crc kubenswrapper[4771]: E0227 01:07:22.772463 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:22 crc kubenswrapper[4771]: I0227 01:07:22.772355 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:22 crc kubenswrapper[4771]: E0227 01:07:22.772779 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:22 crc kubenswrapper[4771]: E0227 01:07:22.772883 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:22 crc kubenswrapper[4771]: I0227 01:07:22.788343 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 27 01:07:22 crc kubenswrapper[4771]: E0227 01:07:22.899785 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:07:23 crc kubenswrapper[4771]: I0227 01:07:23.772618 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:23 crc kubenswrapper[4771]: E0227 01:07:23.772867 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:24 crc kubenswrapper[4771]: I0227 01:07:24.772447 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:24 crc kubenswrapper[4771]: I0227 01:07:24.772498 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:24 crc kubenswrapper[4771]: I0227 01:07:24.772615 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:24 crc kubenswrapper[4771]: E0227 01:07:24.772677 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:24 crc kubenswrapper[4771]: E0227 01:07:24.772849 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:24 crc kubenswrapper[4771]: E0227 01:07:24.773429 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:25 crc kubenswrapper[4771]: I0227 01:07:25.772237 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:25 crc kubenswrapper[4771]: E0227 01:07:25.772468 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:26 crc kubenswrapper[4771]: I0227 01:07:26.773081 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:26 crc kubenswrapper[4771]: I0227 01:07:26.773114 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:26 crc kubenswrapper[4771]: I0227 01:07:26.773178 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:26 crc kubenswrapper[4771]: E0227 01:07:26.773245 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:26 crc kubenswrapper[4771]: E0227 01:07:26.773349 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:26 crc kubenswrapper[4771]: E0227 01:07:26.773437 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.773713 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:27 crc kubenswrapper[4771]: E0227 01:07:27.774199 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.796690 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.812109 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fee99e-baf5-4752-b692-3d8104db3d71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86f48f09a5e07a4685c65f576a1110899727e8aed66e70b9cc68ef5ad582c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\
":[{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.830796 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.843880 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.857515 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.869724 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.880117 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.892807 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: E0227 01:07:27.900945 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.903535 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" 
Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.912020 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.924134 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.940507 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.950574 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.966957 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885
775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.982739 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:27 crc kubenswrapper[4771]: I0227 01:07:27.995394 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:27Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:28 crc kubenswrapper[4771]: I0227 01:07:28.005755 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:28Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:28 crc kubenswrapper[4771]: I0227 01:07:28.018702 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:28Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:28 crc kubenswrapper[4771]: I0227 01:07:28.030176 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:28Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:28 crc kubenswrapper[4771]: I0227 01:07:28.772171 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:28 crc kubenswrapper[4771]: I0227 01:07:28.772212 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:28 crc kubenswrapper[4771]: I0227 01:07:28.772286 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:28 crc kubenswrapper[4771]: E0227 01:07:28.772422 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:28 crc kubenswrapper[4771]: E0227 01:07:28.772649 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:28 crc kubenswrapper[4771]: E0227 01:07:28.773022 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:28 crc kubenswrapper[4771]: I0227 01:07:28.773196 4771 scope.go:117] "RemoveContainer" containerID="e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.576417 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/2.log" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.582638 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.583049 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.598476 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.609896 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fee99e-baf5-4752-b692-3d8104db3d71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86f48f09a5e07a4685c65f576a1110899727e8aed66e70b9cc68ef5ad582c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.631235 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.648053 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.661930 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.683216 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914
380d67d0f9a49d1e41da2e5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 
01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.696178 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.714123 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 
01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.734918 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.753332 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.770427 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.772533 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:29 crc kubenswrapper[4771]: E0227 01:07:29.772820 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.782767 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.802190 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.817334 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.842842 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.863581 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8
d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.882173 4771 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.898708 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:29 crc kubenswrapper[4771]: I0227 01:07:29.919504 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:29Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.149293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.149352 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.149370 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:30 crc kubenswrapper[4771]: 
I0227 01:07:30.149391 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.149406 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:30Z","lastTransitionTime":"2026-02-27T01:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.167216 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.172540 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.172642 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.172664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.172689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.172705 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:30Z","lastTransitionTime":"2026-02-27T01:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.186251 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.190576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.190649 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.190675 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.190706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.190723 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:30Z","lastTransitionTime":"2026-02-27T01:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.209087 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.214506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.214630 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.214658 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.214689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.214711 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:30Z","lastTransitionTime":"2026-02-27T01:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.228094 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.231886 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.231930 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.231939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.231958 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.231969 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:30Z","lastTransitionTime":"2026-02-27T01:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.244752 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.244894 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.588007 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/3.log" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.588765 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/2.log" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.591675 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" exitCode=1 Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.591720 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.591767 4771 scope.go:117] "RemoveContainer" containerID="e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.592401 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.592601 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.605464 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.617605 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fee99e-baf5-4752-b692-3d8104db3d71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86f48f09a5e07a4685c65f576a1110899727e8aed66e70b9cc68ef5ad582c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.630139 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.648986 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.667279 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.683978 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.696814 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.719857 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.748221 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e46d4e9f36c95d9f88b09a405895230f9479f1d0779f23564151fb4593a92d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:06:57Z\\\",\\\"message\\\":\\\"tor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.836788 7002 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 01:06:57.836916 7002 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837144 7002 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837341 7002 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:06:57.837353 7002 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:06:57.837407 7002 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 01:06:57.837783 7002 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 01:06:57.837859 7002 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 01:06:57.837910 7002 factory.go:656] Stopping watch factory\\\\nI0227 01:06:57.837945 7002 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:06:57.837991 7002 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 01\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:29Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:07:29.647255 7328 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:07:29.648736 7328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 01:07:29.648805 7328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 01:07:29.648841 7328 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:07:29.648849 7328 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:07:29.648911 7328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:07:29.648926 7328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:07:29.648928 7328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:07:29.648949 7328 factory.go:656] Stopping watch factory\\\\nI0227 01:07:29.648953 7328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:07:29.648965 7328 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:07:29.648969 7328 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:07:29.648961 7328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:07:29.648995 7328 handler.go:208] Removed *v1.Pod event handler 6\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154e
dc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.760683 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.772374 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.772386 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.772663 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.772386 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.772826 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:30 crc kubenswrapper[4771]: E0227 01:07:30.772950 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.777259 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.794461 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.808128 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.819485 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.836292 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.857531 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.874020 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.903372 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:30 crc kubenswrapper[4771]: I0227 01:07:30.925394 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8
d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:30Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.598163 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/3.log" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.602287 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:07:31 crc kubenswrapper[4771]: E0227 01:07:31.602448 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.626779 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-ce
rts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.650324 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.670008 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.689078 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.710731 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.728958 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.745356 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.758946 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fee99e-baf5-4752-b692-3d8104db3d71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86f48f09a5e07a4685c65f576a1110899727e8aed66e70b9cc68ef5ad582c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.772377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.772479 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:31 crc kubenswrapper[4771]: E0227 01:07:31.772704 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.792135 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.802699 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.817511 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.831710 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.848729 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.862874 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.873999 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.891380 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.918308 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:29Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:07:29.647255 7328 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:07:29.648736 7328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 01:07:29.648805 7328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 01:07:29.648841 7328 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:07:29.648849 7328 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:07:29.648911 7328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:07:29.648926 7328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:07:29.648928 7328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:07:29.648949 7328 factory.go:656] Stopping watch factory\\\\nI0227 01:07:29.648953 7328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:07:29.648965 7328 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:07:29.648969 7328 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:07:29.648961 7328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:07:29.648995 7328 handler.go:208] Removed *v1.Pod event handler 6\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:31 crc kubenswrapper[4771]: I0227 01:07:31.929703 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:31Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:32 crc kubenswrapper[4771]: I0227 01:07:32.772648 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:32 crc kubenswrapper[4771]: I0227 01:07:32.772708 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:32 crc kubenswrapper[4771]: E0227 01:07:32.772769 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:32 crc kubenswrapper[4771]: I0227 01:07:32.772800 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:32 crc kubenswrapper[4771]: E0227 01:07:32.772948 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:32 crc kubenswrapper[4771]: E0227 01:07:32.773245 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:32 crc kubenswrapper[4771]: E0227 01:07:32.902704 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:07:33 crc kubenswrapper[4771]: I0227 01:07:33.772587 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:33 crc kubenswrapper[4771]: E0227 01:07:33.772782 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:34 crc kubenswrapper[4771]: I0227 01:07:34.772664 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:34 crc kubenswrapper[4771]: I0227 01:07:34.772712 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:34 crc kubenswrapper[4771]: I0227 01:07:34.772664 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:34 crc kubenswrapper[4771]: E0227 01:07:34.772837 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:34 crc kubenswrapper[4771]: E0227 01:07:34.772999 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:34 crc kubenswrapper[4771]: E0227 01:07:34.773089 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:35 crc kubenswrapper[4771]: I0227 01:07:35.772587 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:35 crc kubenswrapper[4771]: E0227 01:07:35.772815 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:36 crc kubenswrapper[4771]: I0227 01:07:36.772422 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:36 crc kubenswrapper[4771]: I0227 01:07:36.772540 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:36 crc kubenswrapper[4771]: I0227 01:07:36.772441 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:36 crc kubenswrapper[4771]: E0227 01:07:36.772629 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:36 crc kubenswrapper[4771]: E0227 01:07:36.772750 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:36 crc kubenswrapper[4771]: E0227 01:07:36.772861 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.772181 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:37 crc kubenswrapper[4771]: E0227 01:07:37.772371 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.785481 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.798768 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 
01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.815687 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.834871 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.853013 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.866282 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:37 crc kubenswrapper[4771]: E0227 01:07:37.916107 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.923781 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.955588 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:29Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:07:29.647255 7328 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:07:29.648736 7328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 01:07:29.648805 7328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 01:07:29.648841 7328 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:07:29.648849 7328 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:07:29.648911 7328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:07:29.648926 7328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:07:29.648928 7328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:07:29.648949 7328 factory.go:656] Stopping watch factory\\\\nI0227 01:07:29.648953 7328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:07:29.648965 7328 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:07:29.648969 7328 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:07:29.648961 7328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:07:29.648995 7328 handler.go:208] Removed *v1.Pod event handler 6\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:37 crc kubenswrapper[4771]: I0227 01:07:37.986310 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c
61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258ecca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:37Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.005117 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.024755 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.041723 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.059503 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.072596 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.087583 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.101090 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fee99e-baf5-4752-b692-3d8104db3d71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86f48f09a5e07a4685c65f576a1110899727e8aed66e70b9cc68ef5ad582c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.114659 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.130444 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.141045 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:38Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.772188 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.772242 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:38 crc kubenswrapper[4771]: I0227 01:07:38.772293 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:38 crc kubenswrapper[4771]: E0227 01:07:38.772356 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:38 crc kubenswrapper[4771]: E0227 01:07:38.772439 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:38 crc kubenswrapper[4771]: E0227 01:07:38.772684 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:39 crc kubenswrapper[4771]: I0227 01:07:39.772894 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:39 crc kubenswrapper[4771]: E0227 01:07:39.773077 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.513499 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.513536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.513548 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.513579 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.513590 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:40Z","lastTransitionTime":"2026-02-27T01:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.528590 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.532127 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.532184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.532214 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.532228 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.532236 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:40Z","lastTransitionTime":"2026-02-27T01:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.545748 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.549476 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.549507 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.549517 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.549531 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.549543 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:40Z","lastTransitionTime":"2026-02-27T01:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.563999 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.567452 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.567481 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.567490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.567501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.567510 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:40Z","lastTransitionTime":"2026-02-27T01:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.590311 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.593584 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.593619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.593629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.593644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.593654 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:40Z","lastTransitionTime":"2026-02-27T01:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.607372 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"617895cc-625c-4c2b-869d-7397fcc31df7\\\",\\\"systemUUID\\\":\\\"375bc5bf-73cd-4494-8f02-c45b5f7dcf9a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:40Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.607643 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.772525 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.772598 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:40 crc kubenswrapper[4771]: I0227 01:07:40.772616 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.772878 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.772964 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:40 crc kubenswrapper[4771]: E0227 01:07:40.773038 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:41 crc kubenswrapper[4771]: I0227 01:07:41.772903 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:41 crc kubenswrapper[4771]: E0227 01:07:41.773142 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:42 crc kubenswrapper[4771]: I0227 01:07:42.773163 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:42 crc kubenswrapper[4771]: I0227 01:07:42.773220 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:42 crc kubenswrapper[4771]: I0227 01:07:42.773223 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:42 crc kubenswrapper[4771]: E0227 01:07:42.773532 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:42 crc kubenswrapper[4771]: E0227 01:07:42.773712 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:42 crc kubenswrapper[4771]: E0227 01:07:42.773826 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:42 crc kubenswrapper[4771]: E0227 01:07:42.917275 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:07:43 crc kubenswrapper[4771]: I0227 01:07:43.773020 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:43 crc kubenswrapper[4771]: E0227 01:07:43.773284 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:44 crc kubenswrapper[4771]: I0227 01:07:44.772104 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:44 crc kubenswrapper[4771]: I0227 01:07:44.772158 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:44 crc kubenswrapper[4771]: I0227 01:07:44.772202 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:44 crc kubenswrapper[4771]: E0227 01:07:44.772295 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:44 crc kubenswrapper[4771]: E0227 01:07:44.772429 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:44 crc kubenswrapper[4771]: E0227 01:07:44.772586 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:45 crc kubenswrapper[4771]: I0227 01:07:45.772795 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:45 crc kubenswrapper[4771]: E0227 01:07:45.773056 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:45 crc kubenswrapper[4771]: I0227 01:07:45.997487 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:45 crc kubenswrapper[4771]: E0227 01:07:45.997753 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:07:45 crc kubenswrapper[4771]: E0227 01:07:45.997886 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs podName:15dd6a85-eabc-4a32-a283-33bf72d2a041 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:49.997831526 +0000 UTC m=+242.935392854 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs") pod "network-metrics-daemon-24pv2" (UID: "15dd6a85-eabc-4a32-a283-33bf72d2a041") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 01:07:46 crc kubenswrapper[4771]: I0227 01:07:46.772312 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:46 crc kubenswrapper[4771]: I0227 01:07:46.772304 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:46 crc kubenswrapper[4771]: I0227 01:07:46.772342 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:46 crc kubenswrapper[4771]: E0227 01:07:46.772613 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:46 crc kubenswrapper[4771]: E0227 01:07:46.773106 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:46 crc kubenswrapper[4771]: E0227 01:07:46.773206 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:46 crc kubenswrapper[4771]: I0227 01:07:46.773698 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:07:46 crc kubenswrapper[4771]: E0227 01:07:46.773999 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.772871 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:47 crc kubenswrapper[4771]: E0227 01:07:47.773123 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.792732 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.814074 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-srbwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c460c23-4b4a-458f-a52e-4208b9942829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:15Z\\\",\\\"message\\\":\\\"2026-02-27T01:06:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2\\\\n2026-02-27T01:06:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8380b21f-640b-4d86-b7e6-ef54b71437d2 to /host/opt/cni/bin/\\\\n2026-02-27T01:06:30Z [verbose] multus-daemon started\\\\n2026-02-27T01:06:30Z [verbose] Readiness Indicator file check\\\\n2026-02-27T01:07:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6f826\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-srbwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.834953 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-24pv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd6a85-eabc-4a32-a283-33bf72d2a041\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ln79k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-24pv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.868504 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a330ea3-f476-4b01-978f-7e19a0d58854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b1f500f35381d9393a7024a187e69dce123a882a38e8ae4897e037960eff910\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://236f1232a4fa975a3e6bf35c61ec0abc7f78ed63662c4d499bbf983108bf7978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45797ebb4bce3fa136b1c00152b17dc7ef9ecbeb3aa09aab50254ad3a13bba57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a61eeb6dd950bc2ca609f5082b6094f1c3b258e
cca09daae225f4629269ef14d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://339a1fa2dc21abf3b0351e5d330062cb03eaf70bacc6e8e80fcd1a8110641ee1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78e3f69a3ba53f7586b83962bc3e0fbf0149955bae62f9a7dd8e3da793bd1ea8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5b7cc98fed812ef0a5a773813c7885775dca79aa3e67ba0c8d68a1f5b3956f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65474b5445c977541695ff59a0bf9781958de8318b0c79e267649da5933011e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.889498 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4789374-f7c5-4270-a54a-5fbdd6319021\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8
d06361d9df8258362a5140348369081e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:39Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 01:05:38.414173 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 01:05:38.414343 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 01:05:38.415139 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2364349643/tls.crt::/tmp/serving-cert-2364349643/tls.key\\\\\\\"\\\\nI0227 01:05:39.082511 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 01:05:39.087910 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 01:05:39.087941 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 01:05:39.087973 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 01:05:39.087981 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 01:05:39.093406 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0227 01:05:39.093432 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0227 01:05:39.093449 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093459 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 01:05:39.093468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 01:05:39.093474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 01:05:39.093481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 01:05:39.093502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0227 01:05:39.095953 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:05:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.910720 4771 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bcf40eb23de817fca48360883a5374e4cd3288625df5f309055c91f81877c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: E0227 01:07:47.918718 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.930262 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f66a3f1-a1b0-476b-89ab-41828d8a23c3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33db46668493c52afb6e30bae58f1197802fc18fcc703f9d05931471b2526bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682d71bc07b1c3a5e3d9be1eb9f1dc1a19176f2bcf9942f18c8966864d51b032\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be367e10a4cf507c9297ec9727c7847db6b54785f2efae23213c2a325e50cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a16d7c04e5c94174f79e9a3d7e2fd64f70f7677c4128e6ff85d90ee3446cc724\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.946805 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"83fee99e-baf5-4752-b692-3d8104db3d71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e86f48f09a5e07a4685c65f576a1110899727e8aed66e70b9cc68ef5ad582c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c
56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a00d290ae277671ef6fd27f856398b2bcc7d62616557e2f31ec50efb045d1c56\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:04:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.966699 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:47 crc kubenswrapper[4771]: I0227 01:07:47.986968 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d38e47e7f854180d5efa2847000e86d321b4407a6dcda6edeb94b3782cc2c94d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2af6de58cf79a62c3587a30d48e9a3fdc0f2f0d7c7fae18e3027d605d872b77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:47Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.004132 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca81e505-d53f-496e-bd26-7cec669591e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04adda6dd3474af36df84d8f44a5618b2f7547ebf2189ecad77a577fbb1ab0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2xw9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hw7dn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.022798 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gv8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b4fce63-a548-406d-8663-45d1e335b000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45eb1f633ef22355d1ad05bd40f67c6e722dad119491ca22d10da981d8757826\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vrvxz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gv8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.043792 4771 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc94570-c9c6-41e3-8a2b-0536f371b5da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf86497cdf590898e1cebe80870da1b6699e7a8517039fdce90cd0d5040b0385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45703795d8253a1723f2a2a1c98909f7bc516c0a6463c7708e7f126605cb8aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3722e412b451d56cb80fbb00b7eeb3a2ed8da743baf1fc04d07fbf0c7ea47f7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c8f2c2dd232fd274a15faac042c1f31408c750c8be1021124307ce6e685b107\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2bd6bcf34bd4ca2e5e60f2305627ece0921a7b18451e0910681cda17ae938e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff3f926edf9a2a652369857fa7e1a0f299527c4c6b99a9a0d4f49b8bad67e4f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd559f772b5a5499da4d4b3e31d611b385648f7c825ff2dfb46f74777fe0942d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fcw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hhdz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.078234 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21f824c6-1bde-4e58-b4ef-72a56a140abb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T01:07:29Z\\\",\\\"message\\\":\\\") from k8s.io/client-go/informers/factory.go:160\\\\nI0227 01:07:29.647255 7328 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0227 01:07:29.648736 7328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 01:07:29.648805 7328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 01:07:29.648841 7328 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 01:07:29.648849 7328 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 01:07:29.648911 7328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 01:07:29.648926 7328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 01:07:29.648928 7328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 01:07:29.648949 7328 factory.go:656] Stopping watch factory\\\\nI0227 01:07:29.648953 7328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 01:07:29.648965 7328 ovnkube.go:599] Stopped ovnkube\\\\nI0227 01:07:29.648969 7328 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 01:07:29.648961 7328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 01:07:29.648995 7328 handler.go:208] Removed *v1.Pod event handler 6\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T01:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T01:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9c6hm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-h5vs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.095290 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cbt48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5202229e-c4e0-4bcd-8295-85e4e9f4f4ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d591794984ea0c81e0d8c756e7c24134aac6c8c15e34cf5119921f2e335041f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvdcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cbt48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.115446 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7753e0fc-55c7-4f3e-a5ac-026a71aa8a46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1615ac61dec5d7d017c23cffd34575ac0575bc8cb679233a57222eca25dca675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd417d1b0b53abe6fec3c4c5964722f08a264f22ce487a8ae6447c111dfccc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:06:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-54zs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.135894 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd6cc63e-9af8-4fdb-bd3f-ac14e85d530f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:05:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T01:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://559aae9ebde1e3353fa31e4ab0948f9b28ba95b933fdc26316ce60c4ed59681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://482de5223c2383e7d3e25219b5bbc83570be314da9ed2b2a0f82268221a006c6\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T01:05:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 01:04:49.945337 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 01:04:49.947347 1 observer_polling.go:159] Starting file observer\\\\nI0227 01:04:49.977213 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 01:04:49.981579 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 01:05:16.797987 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 01:05:16.798250 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:05:16Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://593f438eca99f35a3e73dcb0a24ba4da5d91122cd3fbdb0dc437c96240baa2d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ed28ca3311c14ae33774c58dc91745c23b3ad770bacc4c8dae7412fe269ff7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T01:04:47Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.154478 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.172157 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T01:06:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ea1abdcf338068ac8ced798ad33d35e34c0e31f2d32ba78bf1c7857fe8d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T01:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T01:07:48Z is after 2025-08-24T17:21:41Z" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.772132 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.772227 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:48 crc kubenswrapper[4771]: E0227 01:07:48.772979 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:48 crc kubenswrapper[4771]: I0227 01:07:48.772368 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:48 crc kubenswrapper[4771]: E0227 01:07:48.773067 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:48 crc kubenswrapper[4771]: E0227 01:07:48.772880 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:49 crc kubenswrapper[4771]: I0227 01:07:49.772296 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:49 crc kubenswrapper[4771]: E0227 01:07:49.772587 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:50 crc kubenswrapper[4771]: I0227 01:07:50.773048 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:50 crc kubenswrapper[4771]: I0227 01:07:50.773082 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:50 crc kubenswrapper[4771]: E0227 01:07:50.773883 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:50 crc kubenswrapper[4771]: I0227 01:07:50.773106 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:50 crc kubenswrapper[4771]: E0227 01:07:50.773997 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:50 crc kubenswrapper[4771]: E0227 01:07:50.774156 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:50 crc kubenswrapper[4771]: I0227 01:07:50.990013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 01:07:50 crc kubenswrapper[4771]: I0227 01:07:50.990077 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 01:07:50 crc kubenswrapper[4771]: I0227 01:07:50.990089 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 01:07:50 crc kubenswrapper[4771]: I0227 01:07:50.990105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 01:07:50 crc kubenswrapper[4771]: I0227 01:07:50.990119 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T01:07:50Z","lastTransitionTime":"2026-02-27T01:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.078182 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt"] Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.078787 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.082297 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.082319 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.082387 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.084322 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.157220 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.157277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.157327 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.157373 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.157433 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.187346 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podStartSLOduration=138.187326316 podStartE2EDuration="2m18.187326316s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-27 01:07:51.170103655 +0000 UTC m=+184.107664953" watchObservedRunningTime="2026-02-27 01:07:51.187326316 +0000 UTC m=+184.124887604" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.203856 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-54zs9" podStartSLOduration=137.203837136 podStartE2EDuration="2m17.203837136s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.187819999 +0000 UTC m=+184.125381287" watchObservedRunningTime="2026-02-27 01:07:51.203837136 +0000 UTC m=+184.141398434" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.204025 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.204020832 podStartE2EDuration="1m16.204020832s" podCreationTimestamp="2026-02-27 01:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.202225583 +0000 UTC m=+184.139786891" watchObservedRunningTime="2026-02-27 01:07:51.204020832 +0000 UTC m=+184.141582130" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.235266 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gv8pz" podStartSLOduration=138.235248586 podStartE2EDuration="2m18.235248586s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.234563507 +0000 UTC m=+184.172124795" watchObservedRunningTime="2026-02-27 01:07:51.235248586 +0000 UTC m=+184.172809884" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.249532 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hhdz6" podStartSLOduration=138.249517857 podStartE2EDuration="2m18.249517857s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.249363863 +0000 UTC m=+184.186925161" watchObservedRunningTime="2026-02-27 01:07:51.249517857 +0000 UTC m=+184.187079165" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.258078 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.258129 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.258172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.258218 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.258249 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.258258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.258604 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.259134 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.267159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.281760 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53b7e1d8-cd58-4660-8d0e-b2dd934abc07-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xgkzt\" (UID: \"53b7e1d8-cd58-4660-8d0e-b2dd934abc07\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.288675 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cbt48" podStartSLOduration=138.288658993 podStartE2EDuration="2m18.288658993s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.288261362 +0000 UTC m=+184.225822650" watchObservedRunningTime="2026-02-27 01:07:51.288658993 +0000 UTC m=+184.226220281" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.314218 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=95.314200645 podStartE2EDuration="1m35.314200645s" podCreationTimestamp="2026-02-27 01:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.313194848 +0000 UTC m=+184.250756146" watchObservedRunningTime="2026-02-27 01:07:51.314200645 +0000 UTC m=+184.251761933" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.328726 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=86.328707933 podStartE2EDuration="1m26.328707933s" podCreationTimestamp="2026-02-27 01:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.327928842 +0000 UTC m=+184.265490130" watchObservedRunningTime="2026-02-27 01:07:51.328707933 +0000 UTC m=+184.266269221" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.380924 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-srbwq" podStartSLOduration=138.380906117 podStartE2EDuration="2m18.380906117s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.370805918 +0000 UTC m=+184.308367206" watchObservedRunningTime="2026-02-27 01:07:51.380906117 +0000 UTC m=+184.318467405" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.401040 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.404405 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.404395745 podStartE2EDuration="53.404395745s" podCreationTimestamp="2026-02-27 01:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.40422768 +0000 UTC m=+184.341788978" watchObservedRunningTime="2026-02-27 01:07:51.404395745 +0000 UTC m=+184.341957033" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.415876 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.415863581 podStartE2EDuration="29.415863581s" podCreationTimestamp="2026-02-27 01:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.415458331 +0000 UTC m=+184.353019629" watchObservedRunningTime="2026-02-27 01:07:51.415863581 +0000 UTC m=+184.353424869" Feb 27 01:07:51 crc kubenswrapper[4771]: W0227 01:07:51.421206 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b7e1d8_cd58_4660_8d0e_b2dd934abc07.slice/crio-2816a2b3062457a49d6c72a5c5efcba33d032e7bb560fb38a3ae82e61f87f94d WatchSource:0}: Error finding container 2816a2b3062457a49d6c72a5c5efcba33d032e7bb560fb38a3ae82e61f87f94d: Status 404 returned error can't find the container with id 2816a2b3062457a49d6c72a5c5efcba33d032e7bb560fb38a3ae82e61f87f94d Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.669357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" event={"ID":"53b7e1d8-cd58-4660-8d0e-b2dd934abc07","Type":"ContainerStarted","Data":"87f5a13f901890c7de875670d29b58f3254aa6a48ee10579ab40474f9484555c"} Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.669429 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" event={"ID":"53b7e1d8-cd58-4660-8d0e-b2dd934abc07","Type":"ContainerStarted","Data":"2816a2b3062457a49d6c72a5c5efcba33d032e7bb560fb38a3ae82e61f87f94d"} Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.814088 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.815100 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:51 crc kubenswrapper[4771]: E0227 01:07:51.815338 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:51 crc kubenswrapper[4771]: I0227 01:07:51.825469 4771 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 01:07:52 crc kubenswrapper[4771]: I0227 01:07:52.772950 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:52 crc kubenswrapper[4771]: I0227 01:07:52.772998 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:52 crc kubenswrapper[4771]: I0227 01:07:52.772972 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:52 crc kubenswrapper[4771]: E0227 01:07:52.773125 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:52 crc kubenswrapper[4771]: E0227 01:07:52.773278 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:52 crc kubenswrapper[4771]: E0227 01:07:52.773336 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:52 crc kubenswrapper[4771]: E0227 01:07:52.919726 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:07:53 crc kubenswrapper[4771]: I0227 01:07:53.772537 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:53 crc kubenswrapper[4771]: E0227 01:07:53.772737 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:54 crc kubenswrapper[4771]: I0227 01:07:54.772106 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:54 crc kubenswrapper[4771]: I0227 01:07:54.772274 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:54 crc kubenswrapper[4771]: I0227 01:07:54.772330 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:54 crc kubenswrapper[4771]: E0227 01:07:54.772567 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:54 crc kubenswrapper[4771]: E0227 01:07:54.772687 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:54 crc kubenswrapper[4771]: E0227 01:07:54.772885 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:55 crc kubenswrapper[4771]: I0227 01:07:55.773131 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:55 crc kubenswrapper[4771]: E0227 01:07:55.773642 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:56 crc kubenswrapper[4771]: I0227 01:07:56.773212 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:56 crc kubenswrapper[4771]: I0227 01:07:56.773310 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:56 crc kubenswrapper[4771]: I0227 01:07:56.773403 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:56 crc kubenswrapper[4771]: E0227 01:07:56.773615 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:56 crc kubenswrapper[4771]: E0227 01:07:56.773671 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:56 crc kubenswrapper[4771]: E0227 01:07:56.773535 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:57 crc kubenswrapper[4771]: I0227 01:07:57.772759 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:57 crc kubenswrapper[4771]: E0227 01:07:57.773627 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:07:57 crc kubenswrapper[4771]: I0227 01:07:57.774810 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:07:57 crc kubenswrapper[4771]: E0227 01:07:57.775139 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-h5vs8_openshift-ovn-kubernetes(21f824c6-1bde-4e58-b4ef-72a56a140abb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" Feb 27 01:07:57 crc kubenswrapper[4771]: E0227 01:07:57.921328 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:07:58 crc kubenswrapper[4771]: I0227 01:07:58.773084 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:07:58 crc kubenswrapper[4771]: I0227 01:07:58.773122 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:07:58 crc kubenswrapper[4771]: I0227 01:07:58.773220 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:07:58 crc kubenswrapper[4771]: E0227 01:07:58.773332 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:07:58 crc kubenswrapper[4771]: E0227 01:07:58.773431 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:07:58 crc kubenswrapper[4771]: E0227 01:07:58.773536 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:07:59 crc kubenswrapper[4771]: I0227 01:07:59.772672 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:07:59 crc kubenswrapper[4771]: E0227 01:07:59.772882 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:00 crc kubenswrapper[4771]: I0227 01:08:00.772631 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:00 crc kubenswrapper[4771]: I0227 01:08:00.772670 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:00 crc kubenswrapper[4771]: I0227 01:08:00.772694 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:00 crc kubenswrapper[4771]: E0227 01:08:00.772756 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:08:00 crc kubenswrapper[4771]: E0227 01:08:00.772855 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:08:00 crc kubenswrapper[4771]: E0227 01:08:00.772988 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:08:01 crc kubenswrapper[4771]: I0227 01:08:01.772104 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:01 crc kubenswrapper[4771]: E0227 01:08:01.772324 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.709915 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/1.log" Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.710857 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/0.log" Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.710925 4771 generic.go:334] "Generic (PLEG): container finished" podID="3c460c23-4b4a-458f-a52e-4208b9942829" containerID="2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda" exitCode=1 Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.710966 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-srbwq" event={"ID":"3c460c23-4b4a-458f-a52e-4208b9942829","Type":"ContainerDied","Data":"2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda"} Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.711022 4771 scope.go:117] "RemoveContainer" containerID="b8cd304b662d8a405f17df4762a99be3ce2065a7a4e29e37b08be9861d179f0d" Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.711415 4771 scope.go:117] "RemoveContainer" containerID="2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda" Feb 27 01:08:02 crc kubenswrapper[4771]: E0227 01:08:02.711720 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-srbwq_openshift-multus(3c460c23-4b4a-458f-a52e-4208b9942829)\"" pod="openshift-multus/multus-srbwq" podUID="3c460c23-4b4a-458f-a52e-4208b9942829" Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.732895 4771 
Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.732895 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgkzt" podStartSLOduration=149.732870597 podStartE2EDuration="2m29.732870597s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:07:51.691113674 +0000 UTC m=+184.628675002" watchObservedRunningTime="2026-02-27 01:08:02.732870597 +0000 UTC m=+195.670431925"
Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.772500 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.772530 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 01:08:02 crc kubenswrapper[4771]: I0227 01:08:02.772528 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 01:08:02 crc kubenswrapper[4771]: E0227 01:08:02.772922 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 01:08:02 crc kubenswrapper[4771]: E0227 01:08:02.773028 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 01:08:02 crc kubenswrapper[4771]: E0227 01:08:02.772736 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 01:08:02 crc kubenswrapper[4771]: E0227 01:08:02.923890 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 27 01:08:03 crc kubenswrapper[4771]: I0227 01:08:03.716152 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/1.log"
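The "Observed pod startup duration" entry above reports the same interval in two forms: podStartSLOduration=149.732870597 (seconds, which excludes image-pull time) and podStartE2EDuration="2m29.732870597s". They agree here because the zeroed firstStartedPulling/lastFinishedPulling timestamps (0001-01-01) indicate no image pull was recorded, and the "m=+195.67..." suffix on watchObservedRunningTime is Go's monotonic-clock reading, roughly seconds since the kubelet process started. A quick check of the arithmetic in Go:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // podStartE2EDuration exactly as printed in the kubelet entry above.
        d, err := time.ParseDuration("2m29.732870597s")
        if err != nil {
            panic(err)
        }
        fmt.Println(d.Seconds()) // 149.732870597, the podStartSLOduration value

        // A time.Time that carries a monotonic reading prints an "m=+..."
        // suffix, the same suffix visible on watchObservedRunningTime above.
        fmt.Println(time.Now()) // e.g. "... +0000 UTC m=+0.000012345"
    }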
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:03 crc kubenswrapper[4771]: E0227 01:08:03.773053 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:04 crc kubenswrapper[4771]: I0227 01:08:04.772806 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:04 crc kubenswrapper[4771]: I0227 01:08:04.772824 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:04 crc kubenswrapper[4771]: E0227 01:08:04.773308 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:08:04 crc kubenswrapper[4771]: I0227 01:08:04.773078 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:04 crc kubenswrapper[4771]: E0227 01:08:04.773452 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:08:04 crc kubenswrapper[4771]: E0227 01:08:04.773584 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:08:05 crc kubenswrapper[4771]: I0227 01:08:05.772635 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:05 crc kubenswrapper[4771]: E0227 01:08:05.772765 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:06 crc kubenswrapper[4771]: I0227 01:08:06.772470 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:06 crc kubenswrapper[4771]: I0227 01:08:06.772490 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:06 crc kubenswrapper[4771]: E0227 01:08:06.772614 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:08:06 crc kubenswrapper[4771]: I0227 01:08:06.772654 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:06 crc kubenswrapper[4771]: E0227 01:08:06.772787 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:08:06 crc kubenswrapper[4771]: E0227 01:08:06.772954 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:08:07 crc kubenswrapper[4771]: I0227 01:08:07.773749 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:07 crc kubenswrapper[4771]: E0227 01:08:07.773926 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:07 crc kubenswrapper[4771]: E0227 01:08:07.925315 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:08:08 crc kubenswrapper[4771]: I0227 01:08:08.772339 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:08 crc kubenswrapper[4771]: I0227 01:08:08.772411 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:08 crc kubenswrapper[4771]: I0227 01:08:08.772456 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:08 crc kubenswrapper[4771]: E0227 01:08:08.772542 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:08:08 crc kubenswrapper[4771]: E0227 01:08:08.772704 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:08:08 crc kubenswrapper[4771]: E0227 01:08:08.772922 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:08:09 crc kubenswrapper[4771]: I0227 01:08:09.773125 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:09 crc kubenswrapper[4771]: E0227 01:08:09.773251 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:09 crc kubenswrapper[4771]: I0227 01:08:09.774054 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.665853 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-24pv2"] Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.740590 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/3.log" Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.744013 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerStarted","Data":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.744055 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:10 crc kubenswrapper[4771]: E0227 01:08:10.744187 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.744842 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.772574 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.772626 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:10 crc kubenswrapper[4771]: E0227 01:08:10.772706 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.772724 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:10 crc kubenswrapper[4771]: E0227 01:08:10.772859 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:08:10 crc kubenswrapper[4771]: E0227 01:08:10.773013 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:08:10 crc kubenswrapper[4771]: I0227 01:08:10.783992 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podStartSLOduration=157.783967586 podStartE2EDuration="2m37.783967586s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:10.780573804 +0000 UTC m=+203.718135172" watchObservedRunningTime="2026-02-27 01:08:10.783967586 +0000 UTC m=+203.721528904" Feb 27 01:08:12 crc kubenswrapper[4771]: I0227 01:08:12.772571 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:12 crc kubenswrapper[4771]: I0227 01:08:12.772589 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:12 crc kubenswrapper[4771]: E0227 01:08:12.773080 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:08:12 crc kubenswrapper[4771]: I0227 01:08:12.772673 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:12 crc kubenswrapper[4771]: I0227 01:08:12.772677 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:12 crc kubenswrapper[4771]: E0227 01:08:12.773246 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:08:12 crc kubenswrapper[4771]: E0227 01:08:12.773352 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:08:12 crc kubenswrapper[4771]: E0227 01:08:12.773431 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:12 crc kubenswrapper[4771]: E0227 01:08:12.926697 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:08:14 crc kubenswrapper[4771]: I0227 01:08:14.773163 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:14 crc kubenswrapper[4771]: I0227 01:08:14.773219 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:14 crc kubenswrapper[4771]: I0227 01:08:14.773323 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:14 crc kubenswrapper[4771]: I0227 01:08:14.773545 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:14 crc kubenswrapper[4771]: E0227 01:08:14.773678 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:08:14 crc kubenswrapper[4771]: I0227 01:08:14.773752 4771 scope.go:117] "RemoveContainer" containerID="2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda" Feb 27 01:08:14 crc kubenswrapper[4771]: E0227 01:08:14.773857 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:08:14 crc kubenswrapper[4771]: E0227 01:08:14.774121 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:08:14 crc kubenswrapper[4771]: E0227 01:08:14.774308 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:15 crc kubenswrapper[4771]: I0227 01:08:15.762358 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/1.log" Feb 27 01:08:15 crc kubenswrapper[4771]: I0227 01:08:15.762408 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-srbwq" event={"ID":"3c460c23-4b4a-458f-a52e-4208b9942829","Type":"ContainerStarted","Data":"1f5c442299aaf88392fdb9b66293dcde4d1eac2143b2828533d23ec4d8860a72"} Feb 27 01:08:16 crc kubenswrapper[4771]: I0227 01:08:16.772708 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:16 crc kubenswrapper[4771]: I0227 01:08:16.772830 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:16 crc kubenswrapper[4771]: I0227 01:08:16.772721 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:16 crc kubenswrapper[4771]: I0227 01:08:16.772729 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:16 crc kubenswrapper[4771]: E0227 01:08:16.773035 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-24pv2" podUID="15dd6a85-eabc-4a32-a283-33bf72d2a041" Feb 27 01:08:16 crc kubenswrapper[4771]: E0227 01:08:16.773079 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 01:08:16 crc kubenswrapper[4771]: E0227 01:08:16.773145 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 01:08:16 crc kubenswrapper[4771]: E0227 01:08:16.773199 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.772348 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.772348 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.772377 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.772532 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.776330 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.776658 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.776749 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.776767 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.776689 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 27 01:08:18 crc kubenswrapper[4771]: I0227 01:08:18.777206 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.487084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.532357 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4vrtf"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.533109 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.533114 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wgwwp"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.534355 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-76f5q"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.535289 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-s7265"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.535826 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-crfkk"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.536156 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.536202 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.536316 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.537099 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.537456 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.537493 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.540704 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.541792 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.544052 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.545306 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.545618 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.550029 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.554933 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.578833 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9scwl"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.579576 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.579664 4771 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.579715 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.579803 4771 reflector.go:561] object-"openshift-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.579835 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.579927 4771 reflector.go:561] object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.579958 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580010 4771 reflector.go:561] object-"openshift-apiserver"/"image-import-ca": failed to list *v1.ConfigMap: configmaps "image-import-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580030 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"image-import-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-import-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580144 4771 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580188 4771 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580214 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580183 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580300 4771 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-config": failed to list *v1.ConfigMap: configmaps "authentication-operator-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580322 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"authentication-operator-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580393 4771 reflector.go:561] object-"openshift-authentication-operator"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.580445 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580459 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580472 4771 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580508 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580689 4771 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-tls": failed to list *v1.Secret: secrets "machine-approver-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.580719 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580722 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580689 4771 reflector.go:561] object-"openshift-authentication-operator"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580759 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580757 4771 reflector.go:561] 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4": failed to list *v1.Secret: secrets "machine-approver-sa-dockercfg-nl2j4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580812 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-nl2j4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-sa-dockercfg-nl2j4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580823 4771 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580835 4771 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-config": failed to list *v1.ConfigMap: configmaps "machine-approver-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580845 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580854 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-approver-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580870 4771 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580913 4771 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580918 4771 reflector.go:561] 
object-"openshift-authentication-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580928 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580936 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.580945 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.580961 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581016 4771 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581031 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581037 4771 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581049 4771 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no 
relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581077 4771 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581074 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.580972 4771 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581092 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581068 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581111 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581142 4771 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581158 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no 
relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581165 4771 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581187 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581197 4771 reflector.go:561] object-"openshift-authentication-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.581201 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581218 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581229 4771 reflector.go:561] object-"openshift-authentication-operator"/"service-ca-bundle": failed to list *v1.ConfigMap: configmaps "service-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581253 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"service-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581266 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets "oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581283 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: 
failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.581146 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581306 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581328 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581375 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581394 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581439 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581452 4771 reflector.go:561] object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581458 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc 
kubenswrapper[4771]: E0227 01:08:21.581471 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.581505 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-65dsm"] Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581514 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581863 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581541 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581545 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.581973 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582005 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.581589 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581591 4771 reflector.go:561] 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582062 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581586 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582098 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581613 4771 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.582117 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581626 4771 reflector.go:561] object-"openshift-cluster-samples-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582144 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581635 4771 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582171 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581649 4771 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582195 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581671 4771 reflector.go:561] object-"openshift-oauth-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582221 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship 
found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581674 4771 reflector.go:561] object-"openshift-cluster-samples-operator"/"samples-operator-tls": failed to list *v1.Secret: secrets "samples-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582242 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"samples-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582124 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581681 4771 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582274 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581701 4771 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj": failed to list *v1.Secret: secrets "authentication-operator-dockercfg-mz9bj" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582300 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-mz9bj\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"authentication-operator-dockercfg-mz9bj\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581714 4771 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: 
configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582325 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581727 4771 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582349 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.581723 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.581724 4771 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.582427 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.582785 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.583248 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gd7gl"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.583939 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-gd7gl" Feb 27 01:08:21 crc kubenswrapper[4771]: W0227 01:08:21.586965 4771 reflector.go:561] object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w": failed to list *v1.Secret: secrets "cluster-samples-operator-dockercfg-xpp9w" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-samples-operator": no relationship found between node 'crc' and this object Feb 27 01:08:21 crc kubenswrapper[4771]: E0227 01:08:21.586997 4771 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-xpp9w\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-samples-operator-dockercfg-xpp9w\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-samples-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.593316 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.596826 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.597836 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.598432 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.599028 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.601483 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.602066 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.608920 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-policies\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.608966 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609032 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvjg4\" (UniqueName: \"kubernetes.io/projected/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-kube-api-access-lvjg4\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7bd5b18f-fa8c-46d4-a571-630a67b14023-images\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9dm\" (UniqueName: \"kubernetes.io/projected/58839f3c-374c-43d0-ac2e-32c497ead461-kube-api-access-9j9dm\") pod \"downloads-7954f5f757-gd7gl\" (UID: \"58839f3c-374c-43d0-ac2e-32c497ead461\") " pod="openshift-console/downloads-7954f5f757-gd7gl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-serving-cert\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609182 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609204 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609272 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pwmh\" (UniqueName: \"kubernetes.io/projected/608f3fea-4388-4d6b-8795-fbba59621e28-kube-api-access-9pwmh\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609438 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609462 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4fsd\" (UniqueName: \"kubernetes.io/projected/2e58f1a0-a75d-4280-8cfc-c249696d0b38-kube-api-access-b4fsd\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609515 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhdh9\" (UniqueName: \"kubernetes.io/projected/dedd2c80-3f88-4871-82b4-7744b17d00fc-kube-api-access-mhdh9\") pod \"machine-approver-56656f9798-s7265\" (UID: 
\"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609611 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609675 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-service-ca\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609767 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609792 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b3ec1be-ffa3-4733-ac99-7c86693297d7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609923 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.609948 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 
01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.610018 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.610088 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-trusted-ca-bundle\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.610115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.610188 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611253 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzzvb\" (UniqueName: \"kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611284 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611375 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f4eaf94a-ef2d-48bb-8762-bad950a6918a-node-pullsecrets\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611543 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qck5x\" (UniqueName: \"kubernetes.io/projected/db8009a0-8b08-421c-8f35-e3127b0b5e8e-kube-api-access-qck5x\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611594 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8x9z\" (UniqueName: \"kubernetes.io/projected/f4eaf94a-ef2d-48bb-8762-bad950a6918a-kube-api-access-m8x9z\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611617 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bd5b18f-fa8c-46d4-a571-630a67b14023-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611659 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611678 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611700 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611741 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd5b18f-fa8c-46d4-a571-630a67b14023-config\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611792 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611822 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611847 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit-dir\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611900 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b3ec1be-ffa3-4733-ac99-7c86693297d7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611921 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-oauth-serving-cert\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611943 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611962 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611980 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.611999 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2c8n\" (UniqueName: \"kubernetes.io/projected/7bd5b18f-fa8c-46d4-a571-630a67b14023-kube-api-access-b2c8n\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612021 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612042 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hph8x\" (UniqueName: \"kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612100 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-oauth-config\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612123 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612142 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b3ec1be-ffa3-4733-ac99-7c86693297d7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612162 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-config\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612211 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612231 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612254 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612303 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612326 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-dir\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612347 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612368 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-dir\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612399 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612419 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612440 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khtxn\" (UniqueName: \"kubernetes.io/projected/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-kube-api-access-khtxn\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612459 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.612481 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r259g\" (UniqueName: \"kubernetes.io/projected/5b3ec1be-ffa3-4733-ac99-7c86693297d7-kube-api-access-r259g\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.616765 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.617564 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.617726 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.617737 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.618782 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.619907 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620156 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620219 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620319 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620591 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620649 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620741 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620828 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620876 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.621476 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.621658 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.621097 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.622170 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.620828 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.622254 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.622306 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.628752 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.628979 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.629620 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.630145 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.630467 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.630561 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.631174 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.631335 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.632488 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.632583 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.632715 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.632733 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.632861 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.632746 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.633567 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.633682 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.634358 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.652813 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.653069 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.653255 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.653413 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.653431 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rmxxc"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.653536 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.654445 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.656224 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.656416 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.661318 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.661335 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.661618 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.661635 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.662664 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.662817 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.662846 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.662955 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.667298 4771 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.671505 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.675236 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-b77x6"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.675708 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-z6ptp"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.675891 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.676100 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.676960 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.681034 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.685053 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.685813 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.687309 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.688087 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.689232 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.690683 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.691335 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.691813 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535908-hhvn5"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.693510 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98pdr"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.693739 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.694361 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.694360 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.695268 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.696494 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rmkfg"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.696946 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.697858 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-76f5q"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.698645 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.700047 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.700446 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.703427 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.703832 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.707246 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmxc8"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.707968 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
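
Each "SyncLoop ADD" above records pods arriving from the API server, and the paired util.go:30 message means the pod worker found no existing sandbox for that pod in the container runtime, so one must be created over the CRI before any app container can start. Below is a rough sketch of that CRI call against CRI-O, assuming the usual OpenShift socket path; the pod metadata is copied from the log for illustration, and the real kubelet request also carries DNS, port-mapping, cgroup, and security settings elided here.

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// CRI-O's CRI socket; path assumed from the usual OpenShift layout.
	conn, err := grpc.DialContext(ctx, "unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)

	// "No sandbox for pod can be found. Need to start a new one" is
	// followed by a RunPodSandbox call shaped roughly like this.
	resp, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "machine-config-controller-84d6567774-ltdq5",
				Namespace: "openshift-machine-config-operator",
				Uid:       "d5d1adad-cc9f-4d57-8099-d8e3323da190",
				Attempt:   0,
			},
		},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println("started sandbox:", resp.PodSandboxId)
}

On success the returned sandbox ID anchors every later container-create call for the pod, which is why sandbox creation precedes everything else in these logs.
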
Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.710798 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4vrtf"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.712296 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nskbr"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.712951 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.712977 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713008 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713028 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62xs6\" (UniqueName: \"kubernetes.io/projected/ebdc7a41-2398-46bd-9724-aca23394d4b3-kube-api-access-62xs6\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713047 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khtxn\" (UniqueName: \"kubernetes.io/projected/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-kube-api-access-khtxn\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713063 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713078 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-config\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713093 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-images\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713110 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r259g\" (UniqueName: \"kubernetes.io/projected/5b3ec1be-ffa3-4733-ac99-7c86693297d7-kube-api-access-r259g\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713126 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpxvj\" (UniqueName: \"kubernetes.io/projected/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-kube-api-access-gpxvj\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713151 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-policies\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713166 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713184 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkvq9\" (UniqueName: \"kubernetes.io/projected/d5d1adad-cc9f-4d57-8099-d8e3323da190-kube-api-access-pkvq9\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713214 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvjg4\" (UniqueName: \"kubernetes.io/projected/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-kube-api-access-lvjg4\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713229 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/7bd5b18f-fa8c-46d4-a571-630a67b14023-images\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713250 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713264 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6de81df-af0d-4ebe-b254-7a45c4eb5312-config-volume\") pod \"collect-profiles-29535900-m4svc\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-proxy-tls\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423b8446-879f-47e3-9779-14373f259598-serving-cert\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713306 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrdrg\" (UniqueName: \"kubernetes.io/projected/62c59a17-8b65-4876-a007-1cb1f45a7c2b-kube-api-access-jrdrg\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qwgc\" (UID: \"62c59a17-8b65-4876-a007-1cb1f45a7c2b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9dm\" (UniqueName: \"kubernetes.io/projected/58839f3c-374c-43d0-ac2e-32c497ead461-kube-api-access-9j9dm\") pod \"downloads-7954f5f757-gd7gl\" (UID: \"58839f3c-374c-43d0-ac2e-32c497ead461\") " pod="openshift-console/downloads-7954f5f757-gd7gl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-serving-cert\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713359 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-config\") pod 
\"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713398 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713416 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713432 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssnz8\" (UniqueName: \"kubernetes.io/projected/7f792bed-0aa4-455f-8fb7-2b26d76a6172-kube-api-access-ssnz8\") pod \"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713449 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvlhp\" (UniqueName: \"kubernetes.io/projected/423b8446-879f-47e3-9779-14373f259598-kube-api-access-vvlhp\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713464 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc136422-ec9e-411c-9c1c-5704e6033226-serving-cert\") pod \"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713481 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pwmh\" (UniqueName: \"kubernetes.io/projected/608f3fea-4388-4d6b-8795-fbba59621e28-kube-api-access-9pwmh\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713498 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf82l\" (UniqueName: \"kubernetes.io/projected/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-kube-api-access-zf82l\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72432eea-a601-4d93-8aee-41ff9573ff0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713562 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdc7a41-2398-46bd-9724-aca23394d4b3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713593 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713608 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713623 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-ca\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713640 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhdh9\" (UniqueName: 
\"kubernetes.io/projected/dedd2c80-3f88-4871-82b4-7744b17d00fc-kube-api-access-mhdh9\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713656 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713674 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-service-ca\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713692 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4fsd\" (UniqueName: \"kubernetes.io/projected/2e58f1a0-a75d-4280-8cfc-c249696d0b38-kube-api-access-b4fsd\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713723 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581a9e83-a359-4b05-b9c0-0d4c8d39277b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713739 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-264dv\" (UniqueName: \"kubernetes.io/projected/dc136422-ec9e-411c-9c1c-5704e6033226-kube-api-access-264dv\") pod \"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713760 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713778 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713793 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b3ec1be-ffa3-4733-ac99-7c86693297d7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713808 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713822 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/dc136422-ec9e-411c-9c1c-5704e6033226-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713821 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-policies\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebdc7a41-2398-46bd-9724-aca23394d4b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713871 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714296 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714480 4771 util.go:30] "No sandbox for pod can be found. 
Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.713872 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714643 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714664 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-trusted-ca-bundle\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714703 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e49e35fa-4abf-4adb-8e2d-48f71bd28c18-metrics-tls\") pod \"dns-operator-744455d44c-b77x6\" (UID: \"e49e35fa-4abf-4adb-8e2d-48f71bd28c18\") " pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714709 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-h4dqr"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714719 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-client\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.714735 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423b8446-879f-47e3-9779-14373f259598-trusted-ca\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.715136 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.715966 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-trusted-ca-bundle\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.716008 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbg2\" (UniqueName: \"kubernetes.io/projected/b6de81df-af0d-4ebe-b254-7a45c4eb5312-kube-api-access-grbg2\") pod \"collect-profiles-29535900-m4svc\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.716053 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.716659 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7bd5b18f-fa8c-46d4-a571-630a67b14023-images\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717144 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzzvb\" (UniqueName: \"kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717167 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f4eaf94a-ef2d-48bb-8762-bad950a6918a-node-pullsecrets\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qck5x\" (UniqueName: \"kubernetes.io/projected/db8009a0-8b08-421c-8f35-e3127b0b5e8e-kube-api-access-qck5x\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717221 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717238 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d5d1adad-cc9f-4d57-8099-d8e3323da190-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717261 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8x9z\" (UniqueName: \"kubernetes.io/projected/f4eaf94a-ef2d-48bb-8762-bad950a6918a-kube-api-access-m8x9z\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717280 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bd5b18f-fa8c-46d4-a571-630a67b14023-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717299 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717387 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717404 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717420 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd5b18f-fa8c-46d4-a571-630a67b14023-config\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-service-ca\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717494 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit-dir\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717528 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b3ec1be-ffa3-4733-ac99-7c86693297d7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717562 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72432eea-a601-4d93-8aee-41ff9573ff0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717581 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581a9e83-a359-4b05-b9c0-0d4c8d39277b-config\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wgwwp"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717602 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717682 
4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717779 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f4eaf94a-ef2d-48bb-8762-bad950a6918a-node-pullsecrets\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.717827 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b3ec1be-ffa3-4733-ac99-7c86693297d7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.718144 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719018 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd5b18f-fa8c-46d4-a571-630a67b14023-config\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719107 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-oauth-serving-cert\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719167 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2c8n\" (UniqueName: \"kubernetes.io/projected/7bd5b18f-fa8c-46d4-a571-630a67b14023-kube-api-access-b2c8n\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719190 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f792bed-0aa4-455f-8fb7-2b26d76a6172-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423b8446-879f-47e3-9779-14373f259598-config\") pod 
\"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719231 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719285 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719338 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit-dir\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719401 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719444 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hph8x\" (UniqueName: \"kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719514 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719774 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-service-ca\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719778 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: 
\"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719881 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz8jm\" (UniqueName: \"kubernetes.io/projected/706b5440-ad63-4d92-9708-96ce6d6926b8-kube-api-access-kz8jm\") pod \"migrator-59844c95c7-jwx72\" (UID: \"706b5440-ad63-4d92-9708-96ce6d6926b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719915 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6de81df-af0d-4ebe-b254-7a45c4eb5312-secret-volume\") pod \"collect-profiles-29535900-m4svc\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-serving-cert\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.719972 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-oauth-serving-cert\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720031 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720061 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720074 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-oauth-config\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 
01:08:21.720325 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720740 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b3ec1be-ffa3-4733-ac99-7c86693297d7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720818 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62c59a17-8b65-4876-a007-1cb1f45a7c2b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qwgc\" (UID: \"62c59a17-8b65-4876-a007-1cb1f45a7c2b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720784 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-serving-cert\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720841 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9qk2\" (UniqueName: \"kubernetes.io/projected/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-kube-api-access-t9qk2\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720909 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720931 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-config\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720952 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720971 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.720992 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721010 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721029 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-dir\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721052 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lz5m\" (UniqueName: \"kubernetes.io/projected/e49e35fa-4abf-4adb-8e2d-48f71bd28c18-kube-api-access-8lz5m\") pod \"dns-operator-744455d44c-b77x6\" (UID: \"e49e35fa-4abf-4adb-8e2d-48f71bd28c18\") " pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721070 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5d1adad-cc9f-4d57-8099-d8e3323da190-proxy-tls\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721087 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/581a9e83-a359-4b05-b9c0-0d4c8d39277b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721106 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721123 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-dir\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721144 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f792bed-0aa4-455f-8fb7-2b26d76a6172-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721164 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72432eea-a601-4d93-8aee-41ff9573ff0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-dir\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-dir\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721700 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-config\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.721788 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: 
\"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.722097 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.722771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.722958 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.724067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b3ec1be-ffa3-4733-ac99-7c86693297d7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.724594 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwr4f"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.725655 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.725997 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bd5b18f-fa8c-46d4-a571-630a67b14023-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.726237 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.728057 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9scwl"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.729865 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.731186 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.731876 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.732733 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-crqhd"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.735098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.736043 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.737726 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.742847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-oauth-config\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.743485 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-crfkk"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.746136 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.747132 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.747686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.753596 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.756847 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-hhvn5"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.757338 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.759452 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rmxxc"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.761902 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.763866 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.766769 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gd7gl"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.770415 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-65dsm"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.777248 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.781277 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.781342 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.781356 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.781681 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.785598 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.789751 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-z6ptp"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.790951 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.791839 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwr4f"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.792797 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.793759 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-b77x6"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.794747 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98pdr"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.795698 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.796718 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rmkfg"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.797087 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.797823 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.799401 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.800016 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.801117 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.802071 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-wf8jh"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.802615 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.803110 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-77b5k"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.803627 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.804154 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nskbr"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.805014 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.806005 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-crqhd"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.807023 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmxc8"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.807957 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.809078 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wf8jh"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.810076 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mwrgv"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.811023 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.811263 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mwrgv"] Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.818353 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821577 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-ca\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821611 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9n4h\" (UniqueName: \"kubernetes.io/projected/fd53cc9e-5423-4ad7-afe5-54824c08341e-kube-api-access-v9n4h\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c2be6688-5fef-4657-9eea-235fc8bb13f7-profile-collector-cert\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821648 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-metrics-tls\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581a9e83-a359-4b05-b9c0-0d4c8d39277b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821706 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-certs\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821741 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/dc136422-ec9e-411c-9c1c-5704e6033226-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821788 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/740f2438-5f9c-40bb-ae51-77aac4708ab9-service-ca-bundle\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821824 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grbg2\" (UniqueName: \"kubernetes.io/projected/b6de81df-af0d-4ebe-b254-7a45c4eb5312-kube-api-access-grbg2\") pod \"collect-profiles-29535900-m4svc\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821878 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e49e35fa-4abf-4adb-8e2d-48f71bd28c18-metrics-tls\") pod \"dns-operator-744455d44c-b77x6\" (UID: \"e49e35fa-4abf-4adb-8e2d-48f71bd28c18\") " pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.821982 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822016 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d5d1adad-cc9f-4d57-8099-d8e3323da190-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822040 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr2k6\" (UniqueName: \"kubernetes.io/projected/963fd070-b5e6-4a67-afd6-d056aacf8bc2-kube-api-access-sr2k6\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822062 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/50a07abb-e77f-450d-990f-3c9e3b0360d9-signing-key\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822105 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-service-ca\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822128 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ee145931-4993-4e80-88c9-1f8a4f46e77c-config\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822133 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/dc136422-ec9e-411c-9c1c-5704e6033226-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822166 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-cert\") pod \"ingress-canary-wf8jh\" (UID: \"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb\") " pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfm6g\" (UniqueName: \"kubernetes.io/projected/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-kube-api-access-gfm6g\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822237 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f792bed-0aa4-455f-8fb7-2b26d76a6172-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822262 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423b8446-879f-47e3-9779-14373f259598-config\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-serving-cert\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz8jm\" (UniqueName: \"kubernetes.io/projected/706b5440-ad63-4d92-9708-96ce6d6926b8-kube-api-access-kz8jm\") pod \"migrator-59844c95c7-jwx72\" (UID: \"706b5440-ad63-4d92-9708-96ce6d6926b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822366 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6de81df-af0d-4ebe-b254-7a45c4eb5312-secret-volume\") pod \"collect-profiles-29535900-m4svc\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-node-bootstrap-token\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822787 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d5d1adad-cc9f-4d57-8099-d8e3323da190-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822799 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2be6688-5fef-4657-9eea-235fc8bb13f7-srv-cert\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.822992 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b84607-b33c-4c44-8331-9e09df2cccfe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98pdr\" (UID: \"b9b84607-b33c-4c44-8331-9e09df2cccfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823114 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lz5m\" (UniqueName: \"kubernetes.io/projected/e49e35fa-4abf-4adb-8e2d-48f71bd28c18-kube-api-access-8lz5m\") pod \"dns-operator-744455d44c-b77x6\" (UID: \"e49e35fa-4abf-4adb-8e2d-48f71bd28c18\") " pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823143 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5d1adad-cc9f-4d57-8099-d8e3323da190-proxy-tls\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823164 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f792bed-0aa4-455f-8fb7-2b26d76a6172-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823212 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72432eea-a601-4d93-8aee-41ff9573ff0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 
01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823240 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62xs6\" (UniqueName: \"kubernetes.io/projected/ebdc7a41-2398-46bd-9724-aca23394d4b3-kube-api-access-62xs6\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823262 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd53cc9e-5423-4ad7-afe5-54824c08341e-config-volume\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823301 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tct95\" (UID: \"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823322 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpxvj\" (UniqueName: \"kubernetes.io/projected/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-kube-api-access-gpxvj\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823347 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6de81df-af0d-4ebe-b254-7a45c4eb5312-config-volume\") pod \"collect-profiles-29535900-m4svc\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823383 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk5ks\" (UniqueName: \"kubernetes.io/projected/ee145931-4993-4e80-88c9-1f8a4f46e77c-kube-api-access-tk5ks\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823402 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423b8446-879f-47e3-9779-14373f259598-serving-cert\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc 
kubenswrapper[4771]: I0227 01:08:21.823419 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrdrg\" (UniqueName: \"kubernetes.io/projected/62c59a17-8b65-4876-a007-1cb1f45a7c2b-kube-api-access-jrdrg\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qwgc\" (UID: \"62c59a17-8b65-4876-a007-1cb1f45a7c2b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823443 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-config\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-trusted-ca\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823474 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqttp\" (UniqueName: \"kubernetes.io/projected/83ee3033-e504-40f2-9c72-70d863d0d333-kube-api-access-tqttp\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823492 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zlpg\" (UniqueName: \"kubernetes.io/projected/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-kube-api-access-8zlpg\") pod \"ingress-canary-wf8jh\" (UID: \"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb\") " pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823508 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4r9g\" (UniqueName: \"kubernetes.io/projected/602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb-kube-api-access-h4r9g\") pod \"package-server-manager-789f6589d5-tct95\" (UID: \"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823536 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf82l\" (UniqueName: \"kubernetes.io/projected/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-kube-api-access-zf82l\") pod \"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823571 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr4jg\" (UniqueName: \"kubernetes.io/projected/50a07abb-e77f-450d-990f-3c9e3b0360d9-kube-api-access-pr4jg\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 
01:08:21.823599 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdc7a41-2398-46bd-9724-aca23394d4b3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823622 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-srv-cert\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823664 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-264dv\" (UniqueName: \"kubernetes.io/projected/dc136422-ec9e-411c-9c1c-5704e6033226-kube-api-access-264dv\") pod \"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823681 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdbfg\" (UniqueName: \"kubernetes.io/projected/c2be6688-5fef-4657-9eea-235fc8bb13f7-kube-api-access-kdbfg\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f792bed-0aa4-455f-8fb7-2b26d76a6172-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823729 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebdc7a41-2398-46bd-9724-aca23394d4b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823749 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-apiservice-cert\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc 
kubenswrapper[4771]: I0227 01:08:21.823764 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823781 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823855 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-client\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.823876 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423b8446-879f-47e3-9779-14373f259598-trusted-ca\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824134 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttp4j\" (UniqueName: \"kubernetes.io/projected/e0d5634e-ce3f-40a5-b85d-64f8c4708c59-kube-api-access-ttp4j\") pod \"auto-csr-approver-29535908-hhvn5\" (UID: \"e0d5634e-ce3f-40a5-b85d-64f8c4708c59\") " pod="openshift-infra/auto-csr-approver-29535908-hhvn5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-default-certificate\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd53cc9e-5423-4ad7-afe5-54824c08341e-metrics-tls\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824236 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/50a07abb-e77f-450d-990f-3c9e3b0360d9-signing-cabundle\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824266 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/581a9e83-a359-4b05-b9c0-0d4c8d39277b-config\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824330 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72432eea-a601-4d93-8aee-41ff9573ff0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824408 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824454 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62c59a17-8b65-4876-a007-1cb1f45a7c2b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qwgc\" (UID: \"62c59a17-8b65-4876-a007-1cb1f45a7c2b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9qk2\" (UniqueName: \"kubernetes.io/projected/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-kube-api-access-t9qk2\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824494 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-metrics-certs\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824505 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebdc7a41-2398-46bd-9724-aca23394d4b3-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824517 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824635 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-stats-auth\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824664 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/581a9e83-a359-4b05-b9c0-0d4c8d39277b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824697 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj7ld\" (UniqueName: \"kubernetes.io/projected/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-kube-api-access-sj7ld\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824731 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824845 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/581a9e83-a359-4b05-b9c0-0d4c8d39277b-config\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.824755 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-images\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825033 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-webhook-cert\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825163 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-config\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825167 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/581a9e83-a359-4b05-b9c0-0d4c8d39277b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgv82\" (UniqueName: \"kubernetes.io/projected/b9b84607-b33c-4c44-8331-9e09df2cccfe-kube-api-access-wgv82\") pod \"multus-admission-controller-857f4d67dd-98pdr\" (UID: \"b9b84607-b33c-4c44-8331-9e09df2cccfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825222 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkvq9\" (UniqueName: \"kubernetes.io/projected/d5d1adad-cc9f-4d57-8099-d8e3323da190-kube-api-access-pkvq9\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825242 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45ck\" (UniqueName: \"kubernetes.io/projected/740f2438-5f9c-40bb-ae51-77aac4708ab9-kube-api-access-s45ck\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825260 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnlw6\" (UniqueName: \"kubernetes.io/projected/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-kube-api-access-vnlw6\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825289 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-proxy-tls\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825307 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-tmpfs\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825324 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ee145931-4993-4e80-88c9-1f8a4f46e77c-serving-cert\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825361 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssnz8\" (UniqueName: \"kubernetes.io/projected/7f792bed-0aa4-455f-8fb7-2b26d76a6172-kube-api-access-ssnz8\") pod \"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825402 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc136422-ec9e-411c-9c1c-5704e6033226-serving-cert\") pod \"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825423 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvlhp\" (UniqueName: \"kubernetes.io/projected/423b8446-879f-47e3-9779-14373f259598-kube-api-access-vvlhp\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825441 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72432eea-a601-4d93-8aee-41ff9573ff0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-config\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.825820 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-images\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.826539 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f792bed-0aa4-455f-8fb7-2b26d76a6172-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.827621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.827825 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62c59a17-8b65-4876-a007-1cb1f45a7c2b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qwgc\" (UID: \"62c59a17-8b65-4876-a007-1cb1f45a7c2b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.828135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5d1adad-cc9f-4d57-8099-d8e3323da190-proxy-tls\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.828743 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-proxy-tls\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.832885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdc7a41-2398-46bd-9724-aca23394d4b3-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.837658 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.846328 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.857889 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.901212 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.917442 4771 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.925251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-serving-cert\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927046 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c2be6688-5fef-4657-9eea-235fc8bb13f7-profile-collector-cert\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927093 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9n4h\" (UniqueName: \"kubernetes.io/projected/fd53cc9e-5423-4ad7-afe5-54824c08341e-kube-api-access-v9n4h\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927119 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-metrics-tls\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/740f2438-5f9c-40bb-ae51-77aac4708ab9-service-ca-bundle\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927187 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-certs\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927251 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr2k6\" (UniqueName: \"kubernetes.io/projected/963fd070-b5e6-4a67-afd6-d056aacf8bc2-kube-api-access-sr2k6\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927280 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/50a07abb-e77f-450d-990f-3c9e3b0360d9-signing-key\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927315 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-cert\") pod 
\"ingress-canary-wf8jh\" (UID: \"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb\") " pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927338 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee145931-4993-4e80-88c9-1f8a4f46e77c-config\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927367 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfm6g\" (UniqueName: \"kubernetes.io/projected/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-kube-api-access-gfm6g\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-node-bootstrap-token\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927520 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2be6688-5fef-4657-9eea-235fc8bb13f7-srv-cert\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927586 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b84607-b33c-4c44-8331-9e09df2cccfe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98pdr\" (UID: \"b9b84607-b33c-4c44-8331-9e09df2cccfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927669 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd53cc9e-5423-4ad7-afe5-54824c08341e-config-volume\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927744 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tct95\" (UID: \"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927863 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk5ks\" (UniqueName: \"kubernetes.io/projected/ee145931-4993-4e80-88c9-1f8a4f46e77c-kube-api-access-tk5ks\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 
01:08:21.927918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-trusted-ca\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.927968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4r9g\" (UniqueName: \"kubernetes.io/projected/602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb-kube-api-access-h4r9g\") pod \"package-server-manager-789f6589d5-tct95\" (UID: \"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqttp\" (UniqueName: \"kubernetes.io/projected/83ee3033-e504-40f2-9c72-70d863d0d333-kube-api-access-tqttp\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928040 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zlpg\" (UniqueName: \"kubernetes.io/projected/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-kube-api-access-8zlpg\") pod \"ingress-canary-wf8jh\" (UID: \"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb\") " pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928072 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr4jg\" (UniqueName: \"kubernetes.io/projected/50a07abb-e77f-450d-990f-3c9e3b0360d9-kube-api-access-pr4jg\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928159 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-srv-cert\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928288 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdbfg\" (UniqueName: \"kubernetes.io/projected/c2be6688-5fef-4657-9eea-235fc8bb13f7-kube-api-access-kdbfg\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928354 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-apiservice-cert\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928446 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928535 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttp4j\" (UniqueName: \"kubernetes.io/projected/e0d5634e-ce3f-40a5-b85d-64f8c4708c59-kube-api-access-ttp4j\") pod \"auto-csr-approver-29535908-hhvn5\" (UID: \"e0d5634e-ce3f-40a5-b85d-64f8c4708c59\") " pod="openshift-infra/auto-csr-approver-29535908-hhvn5" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928830 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-default-certificate\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928874 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd53cc9e-5423-4ad7-afe5-54824c08341e-metrics-tls\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.928906 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/50a07abb-e77f-450d-990f-3c9e3b0360d9-signing-cabundle\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929030 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929080 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929125 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-metrics-certs\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc 
kubenswrapper[4771]: I0227 01:08:21.929166 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-stats-auth\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929224 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj7ld\" (UniqueName: \"kubernetes.io/projected/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-kube-api-access-sj7ld\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929271 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-webhook-cert\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929330 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgv82\" (UniqueName: \"kubernetes.io/projected/b9b84607-b33c-4c44-8331-9e09df2cccfe-kube-api-access-wgv82\") pod \"multus-admission-controller-857f4d67dd-98pdr\" (UID: \"b9b84607-b33c-4c44-8331-9e09df2cccfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929385 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s45ck\" (UniqueName: \"kubernetes.io/projected/740f2438-5f9c-40bb-ae51-77aac4708ab9-kube-api-access-s45ck\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929427 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnlw6\" (UniqueName: \"kubernetes.io/projected/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-kube-api-access-vnlw6\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929489 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-tmpfs\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.929532 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee145931-4993-4e80-88c9-1f8a4f46e77c-serving-cert\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.930096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-tmpfs\") 
pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.937857 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.947723 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-client\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.957456 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.978040 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.984805 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-config\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:21 crc kubenswrapper[4771]: I0227 01:08:21.999243 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.002804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-ca\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.017226 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.022806 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-etcd-service-ca\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.037706 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.058071 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.065209 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.077410 4771 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.097129 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.117911 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.138021 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.146791 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e49e35fa-4abf-4adb-8e2d-48f71bd28c18-metrics-tls\") pod \"dns-operator-744455d44c-b77x6\" (UID: \"e49e35fa-4abf-4adb-8e2d-48f71bd28c18\") " pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.157546 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.177799 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.198383 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.209710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc136422-ec9e-411c-9c1c-5704e6033226-serving-cert\") pod \"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.217493 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.238640 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.257689 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.268315 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/423b8446-879f-47e3-9779-14373f259598-serving-cert\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.277656 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.283158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/423b8446-879f-47e3-9779-14373f259598-config\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:22 crc 
kubenswrapper[4771]: I0227 01:08:22.306769 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.317001 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423b8446-879f-47e3-9779-14373f259598-trusted-ca\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.318361 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.338505 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.358769 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.364986 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72432eea-a601-4d93-8aee-41ff9573ff0a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.378027 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.398535 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.418453 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.428097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72432eea-a601-4d93-8aee-41ff9573ff0a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.438439 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.442957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.445793 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6de81df-af0d-4ebe-b254-7a45c4eb5312-secret-volume\") pod \"collect-profiles-29535900-m4svc\" (UID: 
\"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.452454 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c2be6688-5fef-4657-9eea-235fc8bb13f7-profile-collector-cert\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.457627 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.464726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6de81df-af0d-4ebe-b254-7a45c4eb5312-config-volume\") pod \"collect-profiles-29535900-m4svc\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.477362 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.497972 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.518374 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.537223 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.557472 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.576977 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.597361 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.618128 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.638237 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.657974 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.671713 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b9b84607-b33c-4c44-8331-9e09df2cccfe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98pdr\" (UID: \"b9b84607-b33c-4c44-8331-9e09df2cccfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 
01:08:22.678469 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.696756 4771 request.go:700] Waited for 1.001306739s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackage-server-manager-serving-cert&limit=500&resourceVersion=0 Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.698256 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.711902 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tct95\" (UID: \"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.713275 4771 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.713318 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.713358 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.213336323 +0000 UTC m=+216.150897641 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.713394 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.213373194 +0000 UTC m=+216.150934572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.713283 4771 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.713521 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.213488677 +0000 UTC m=+216.151050005 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.714702 4771 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.714725 4771 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.714742 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.214733981 +0000 UTC m=+216.152295269 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.714776 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert podName:179b172a-a753-4f11-9532-63816979538a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.214764382 +0000 UTC m=+216.152325670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert") pod "route-controller-manager-6576b87f9c-g6lg5" (UID: "179b172a-a753-4f11-9532-63816979538a") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.714785 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.714856 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.214836073 +0000 UTC m=+216.152397391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.715996 4771 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716032 4771 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716054 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716068 4771 configmap.go:193] Couldn't get configMap openshift-authentication-operator/authentication-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716082 4771 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716057 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.216045026 +0000 UTC m=+216.153606414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716120 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.216108038 +0000 UTC m=+216.153669406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716135 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.216128018 +0000 UTC m=+216.153689406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716056 4771 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716149 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.216142658 +0000 UTC m=+216.153704066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716166 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.216159329 +0000 UTC m=+216.153720747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716000 4771 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716183 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.216175409 +0000 UTC m=+216.153736817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.716275 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.216253551 +0000 UTC m=+216.153814879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.717779 4771 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.717817 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls podName:dedd2c80-3f88-4871-82b4-7744b17d00fc nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.217807003 +0000 UTC m=+216.155368291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls") pod "machine-approver-56656f9798-s7265" (UID: "dedd2c80-3f88-4871-82b4-7744b17d00fc") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.717929 4771 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.717960 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.717996 4771 secret.go:188] Couldn't get secret openshift-authentication-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718020 4771 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.717966 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config podName:dedd2c80-3f88-4871-82b4-7744b17d00fc nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.217957097 +0000 UTC m=+216.155518495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config") pod "machine-approver-56656f9798-s7265" (UID: "dedd2c80-3f88-4871-82b4-7744b17d00fc") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.717916 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718044 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.218035619 +0000 UTC m=+216.155597017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718062 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.218057149 +0000 UTC m=+216.155618547 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718085 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config podName:dedd2c80-3f88-4871-82b4-7744b17d00fc nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.21807023 +0000 UTC m=+216.155631618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config") pod "machine-approver-56656f9798-s7265" (UID: "dedd2c80-3f88-4871-82b4-7744b17d00fc") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718162 4771 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718207 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.218196373 +0000 UTC m=+216.155757781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718217 4771 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718327 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.218305026 +0000 UTC m=+216.155866414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718369 4771 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718389 4771 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718400 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.218392508 +0000 UTC m=+216.155953916 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.718423 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.218413748 +0000 UTC m=+216.155975146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721708 4771 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721773 4771 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721782 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.221763349 +0000 UTC m=+216.159324667 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721786 4771 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721824 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config podName:179b172a-a753-4f11-9532-63816979538a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.22181509 +0000 UTC m=+216.159376498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config") pod "route-controller-manager-6576b87f9c-g6lg5" (UID: "179b172a-a753-4f11-9532-63816979538a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721853 4771 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721870 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca podName:179b172a-a753-4f11-9532-63816979538a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.221857641 +0000 UTC m=+216.159419029 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca") pod "route-controller-manager-6576b87f9c-g6lg5" (UID: "179b172a-a753-4f11-9532-63816979538a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721913 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.221904072 +0000 UTC m=+216.159465470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721879 4771 secret.go:188] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721965 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls podName:ca03f0a2-fdee-42d5-a671-212f7b35b6aa nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.221949963 +0000 UTC m=+216.159511291 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls") pod "cluster-samples-operator-665b6dd947-cqlnh" (UID: "ca03f0a2-fdee-42d5-a671-212f7b35b6aa") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721893 4771 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.722017 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.222003705 +0000 UTC m=+216.159565093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.721920 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.722053 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.222046396 +0000 UTC m=+216.159607884 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.737878 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.757539 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.772371 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/50a07abb-e77f-450d-990f-3c9e3b0360d9-signing-key\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.778321 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.796711 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.801232 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/50a07abb-e77f-450d-990f-3c9e3b0360d9-signing-cabundle\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.818539 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.832749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2be6688-5fef-4657-9eea-235fc8bb13f7-srv-cert\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.837738 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.842961 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-srv-cert\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.858221 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.879818 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.898153 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 01:08:22 crc 
kubenswrapper[4771]: E0227 01:08:22.928278 4771 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928361 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-node-bootstrap-token podName:83ee3033-e504-40f2-9c72-70d863d0d333 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.428341597 +0000 UTC m=+216.365902875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-node-bootstrap-token") pod "machine-config-server-77b5k" (UID: "83ee3033-e504-40f2-9c72-70d863d0d333") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928359 4771 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928378 4771 secret.go:188] Couldn't get secret openshift-ingress-operator/metrics-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928398 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-certs podName:83ee3033-e504-40f2-9c72-70d863d0d333 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.428388968 +0000 UTC m=+216.365950256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-certs") pod "machine-config-server-77b5k" (UID: "83ee3033-e504-40f2-9c72-70d863d0d333") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928414 4771 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928441 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-cert podName:d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.428433959 +0000 UTC m=+216.365995247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-cert") pod "ingress-canary-wf8jh" (UID: "d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928478 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-metrics-tls podName:af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.42845148 +0000 UTC m=+216.366012798 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-metrics-tls") pod "ingress-operator-5b745b69d9-bdlp9" (UID: "af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928480 4771 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928519 4771 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928531 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee145931-4993-4e80-88c9-1f8a4f46e77c-config podName:ee145931-4993-4e80-88c9-1f8a4f46e77c nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.428518662 +0000 UTC m=+216.366079990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ee145931-4993-4e80-88c9-1f8a4f46e77c-config") pod "service-ca-operator-777779d784-nskbr" (UID: "ee145931-4993-4e80-88c9-1f8a4f46e77c") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928566 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fd53cc9e-5423-4ad7-afe5-54824c08341e-config-volume podName:fd53cc9e-5423-4ad7-afe5-54824c08341e nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.428540913 +0000 UTC m=+216.366102341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/fd53cc9e-5423-4ad7-afe5-54824c08341e-config-volume") pod "dns-default-crqhd" (UID: "fd53cc9e-5423-4ad7-afe5-54824c08341e") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928593 4771 configmap.go:193] Couldn't get configMap openshift-ingress-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928619 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-trusted-ca podName:af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.428611185 +0000 UTC m=+216.366172593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-trusted-ca") pod "ingress-operator-5b745b69d9-bdlp9" (UID: "af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928644 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928670 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-apiservice-cert podName:6c8a25c9-89d7-4606-8f6e-fe1b46b061eb nodeName:}" failed. 
No retries permitted until 2026-02-27 01:08:23.428660556 +0000 UTC m=+216.366221964 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-apiservice-cert") pod "packageserver-d55dfcdfc-tdq29" (UID: "6c8a25c9-89d7-4606-8f6e-fe1b46b061eb") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928697 4771 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928718 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/740f2438-5f9c-40bb-ae51-77aac4708ab9-service-ca-bundle podName:740f2438-5f9c-40bb-ae51-77aac4708ab9 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.428712297 +0000 UTC m=+216.366273705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/740f2438-5f9c-40bb-ae51-77aac4708ab9-service-ca-bundle") pod "router-default-5444994796-h4dqr" (UID: "740f2438-5f9c-40bb-ae51-77aac4708ab9") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.928988 4771 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.929038 4771 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.929056 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-default-certificate podName:740f2438-5f9c-40bb-ae51-77aac4708ab9 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.429035436 +0000 UTC m=+216.366596784 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-default-certificate") pod "router-default-5444994796-h4dqr" (UID: "740f2438-5f9c-40bb-ae51-77aac4708ab9") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.929090 4771 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.929131 4771 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.929130 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca podName:963fd070-b5e6-4a67-afd6-d056aacf8bc2 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.429097077 +0000 UTC m=+216.366658415 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca") pod "marketplace-operator-79b997595-rwr4f" (UID: "963fd070-b5e6-4a67-afd6-d056aacf8bc2") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.929163 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics podName:963fd070-b5e6-4a67-afd6-d056aacf8bc2 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.429152659 +0000 UTC m=+216.366714047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics") pod "marketplace-operator-79b997595-rwr4f" (UID: "963fd070-b5e6-4a67-afd6-d056aacf8bc2") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.929191 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd53cc9e-5423-4ad7-afe5-54824c08341e-metrics-tls podName:fd53cc9e-5423-4ad7-afe5-54824c08341e nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.429172319 +0000 UTC m=+216.366733707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd53cc9e-5423-4ad7-afe5-54824c08341e-metrics-tls") pod "dns-default-crqhd" (UID: "fd53cc9e-5423-4ad7-afe5-54824c08341e") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.930080 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.930134 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-webhook-cert podName:6c8a25c9-89d7-4606-8f6e-fe1b46b061eb nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.430117695 +0000 UTC m=+216.367679083 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-webhook-cert") pod "packageserver-d55dfcdfc-tdq29" (UID: "6c8a25c9-89d7-4606-8f6e-fe1b46b061eb") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.930149 4771 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.930154 4771 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.930165 4771 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.930202 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-metrics-certs podName:740f2438-5f9c-40bb-ae51-77aac4708ab9 nodeName:}" failed. 
No retries permitted until 2026-02-27 01:08:23.430194037 +0000 UTC m=+216.367755445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-metrics-certs") pod "router-default-5444994796-h4dqr" (UID: "740f2438-5f9c-40bb-ae51-77aac4708ab9") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.930232 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee145931-4993-4e80-88c9-1f8a4f46e77c-serving-cert podName:ee145931-4993-4e80-88c9-1f8a4f46e77c nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.430212037 +0000 UTC m=+216.367773365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee145931-4993-4e80-88c9-1f8a4f46e77c-serving-cert") pod "service-ca-operator-777779d784-nskbr" (UID: "ee145931-4993-4e80-88c9-1f8a4f46e77c") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: E0227 01:08:22.930273 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-stats-auth podName:740f2438-5f9c-40bb-ae51-77aac4708ab9 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:23.430262899 +0000 UTC m=+216.367824227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-stats-auth") pod "router-default-5444994796-h4dqr" (UID: "740f2438-5f9c-40bb-ae51-77aac4708ab9") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.944592 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khtxn\" (UniqueName: \"kubernetes.io/projected/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-kube-api-access-khtxn\") pod \"oauth-openshift-558db77b4-9scwl\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.966050 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9dm\" (UniqueName: \"kubernetes.io/projected/58839f3c-374c-43d0-ac2e-32c497ead461-kube-api-access-9j9dm\") pod \"downloads-7954f5f757-gd7gl\" (UID: \"58839f3c-374c-43d0-ac2e-32c497ead461\") " pod="openshift-console/downloads-7954f5f757-gd7gl" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.972114 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-gd7gl" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.977650 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 27 01:08:22 crc kubenswrapper[4771]: I0227 01:08:22.997789 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.018257 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.038811 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.058919 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.079902 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.158299 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.165678 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gd7gl"] Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.177584 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.197792 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.205986 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.218268 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.238212 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.259111 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.264414 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.264581 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.264657 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.264900 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265034 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265107 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265250 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265391 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265490 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265540 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265742 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.265967 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266176 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266229 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266334 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266441 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266754 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266858 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266911 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.266978 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.267044 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.267141 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.267249 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.267320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.267584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.278264 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.321077 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r259g\" (UniqueName: \"kubernetes.io/projected/5b3ec1be-ffa3-4733-ac99-7c86693297d7-kube-api-access-r259g\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.364109 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qck5x\" (UniqueName: \"kubernetes.io/projected/db8009a0-8b08-421c-8f35-e3127b0b5e8e-kube-api-access-qck5x\") pod \"console-f9d7485db-65dsm\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.394878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2c8n\" (UniqueName: \"kubernetes.io/projected/7bd5b18f-fa8c-46d4-a571-630a67b14023-kube-api-access-b2c8n\") pod \"machine-api-operator-5694c8668f-4vrtf\" (UID: \"7bd5b18f-fa8c-46d4-a571-630a67b14023\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.408159 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9scwl"] Feb 27 01:08:23 crc 
kubenswrapper[4771]: I0227 01:08:23.433879 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b3ec1be-ffa3-4733-ac99-7c86693297d7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdp46\" (UID: \"5b3ec1be-ffa3-4733-ac99-7c86693297d7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.437988 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.466785 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.470395 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-metrics-tls\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.470508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/740f2438-5f9c-40bb-ae51-77aac4708ab9-service-ca-bundle\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.470597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-certs\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.470786 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-cert\") pod \"ingress-canary-wf8jh\" (UID: \"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb\") " pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.470850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee145931-4993-4e80-88c9-1f8a4f46e77c-config\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-node-bootstrap-token\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471146 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd53cc9e-5423-4ad7-afe5-54824c08341e-config-volume\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" 
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-trusted-ca\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471686 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/740f2438-5f9c-40bb-ae51-77aac4708ab9-service-ca-bundle\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-apiservice-cert\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471783 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd53cc9e-5423-4ad7-afe5-54824c08341e-metrics-tls\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-default-certificate\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471921 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee145931-4993-4e80-88c9-1f8a4f46e77c-config\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.471998 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.472024 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-metrics-certs\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.472065 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-stats-auth\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.472122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-webhook-cert\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.472196 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee145931-4993-4e80-88c9-1f8a4f46e77c-serving-cert\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.472680 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-trusted-ca\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.476463 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee145931-4993-4e80-88c9-1f8a4f46e77c-serving-cert\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.476525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-stats-auth\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.476627 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-default-certificate\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.477138 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/740f2438-5f9c-40bb-ae51-77aac4708ab9-metrics-certs\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.478876 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.479030 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-webhook-cert\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.479534 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-apiservice-cert\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.485213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-metrics-tls\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.498374 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.521849 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.523674 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.538400 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.558460 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.567422 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-65dsm"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.579781 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.590507 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.609801 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.614746 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.618081 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.643323 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.652496 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd53cc9e-5423-4ad7-afe5-54824c08341e-config-volume\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.658261 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.673827 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.681115 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.687536 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd53cc9e-5423-4ad7-afe5-54824c08341e-metrics-tls\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.698153 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.713668 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46"] Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.716328 4771 request.go:700] Waited for 1.913462678s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.718199 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 27 01:08:23 crc kubenswrapper[4771]: W0227 01:08:23.725400 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b3ec1be_ffa3_4733_ac99_7c86693297d7.slice/crio-c31829712e9da519ae54b53872ba32c87adba47762385340119f86013fd37999 WatchSource:0}: Error finding container c31829712e9da519ae54b53872ba32c87adba47762385340119f86013fd37999: Status 404 returned error can't find the container with id c31829712e9da519ae54b53872ba32c87adba47762385340119f86013fd37999 Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.725757 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-cert\") pod \"ingress-canary-wf8jh\" (UID: \"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb\") " pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.738497 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.757640 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.781013 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.799393 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.803634 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-certs\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " 
pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.812435 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" event={"ID":"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7","Type":"ContainerStarted","Data":"2f27cd8996898523845f5cd911350e6f00b9b64e518e26cf10570b63b113837a"} Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.812488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" event={"ID":"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7","Type":"ContainerStarted","Data":"bd48d9c6210401197a31138d84b868da7abaf8546c8833dfe1bc6c759639d834"} Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.812789 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.815772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gd7gl" event={"ID":"58839f3c-374c-43d0-ac2e-32c497ead461","Type":"ContainerStarted","Data":"70bede3954171847df1f4fbfa355def36b1b46393b5cfd1248dd7b94bdfea603"} Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.815824 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gd7gl" event={"ID":"58839f3c-374c-43d0-ac2e-32c497ead461","Type":"ContainerStarted","Data":"315e5bef74bfe11597d021863b7e233ae89c6e73c3747f1da4d3218af56feb4c"} Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.816142 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gd7gl" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.816974 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" event={"ID":"5b3ec1be-ffa3-4733-ac99-7c86693297d7","Type":"ContainerStarted","Data":"c31829712e9da519ae54b53872ba32c87adba47762385340119f86013fd37999"} Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.819223 4771 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9scwl container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.819259 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-gd7gl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.819312 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gd7gl" podUID="58839f3c-374c-43d0-ac2e-32c497ead461" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.819269 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Feb 27 
01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.823515 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.835885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/83ee3033-e504-40f2-9c72-70d863d0d333-node-bootstrap-token\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.858480 4771 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.879752 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.894438 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4vrtf"] Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.898271 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 27 01:08:23 crc kubenswrapper[4771]: W0227 01:08:23.904484 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd5b18f_fa8c_46d4_a571_630a67b14023.slice/crio-a92bc83d75fc66a1df2b883c1d93c6d331c8985d5eb1e6362aaadb124388da9a WatchSource:0}: Error finding container a92bc83d75fc66a1df2b883c1d93c6d331c8985d5eb1e6362aaadb124388da9a: Status 404 returned error can't find the container with id a92bc83d75fc66a1df2b883c1d93c6d331c8985d5eb1e6362aaadb124388da9a Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.932630 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbg2\" (UniqueName: \"kubernetes.io/projected/b6de81df-af0d-4ebe-b254-7a45c4eb5312-kube-api-access-grbg2\") pod \"collect-profiles-29535900-m4svc\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.953581 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz8jm\" (UniqueName: \"kubernetes.io/projected/706b5440-ad63-4d92-9708-96ce6d6926b8-kube-api-access-kz8jm\") pod \"migrator-59844c95c7-jwx72\" (UID: \"706b5440-ad63-4d92-9708-96ce6d6926b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.974813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lz5m\" (UniqueName: \"kubernetes.io/projected/e49e35fa-4abf-4adb-8e2d-48f71bd28c18-kube-api-access-8lz5m\") pod \"dns-operator-744455d44c-b77x6\" (UID: \"e49e35fa-4abf-4adb-8e2d-48f71bd28c18\") " pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.980845 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" Feb 27 01:08:23 crc kubenswrapper[4771]: E0227 01:08:23.984715 4771 projected.go:288] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:23 crc kubenswrapper[4771]: I0227 01:08:23.990610 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/257bed8a-876b-4f5e-8a4c-66c1e47b33dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zv9vk\" (UID: \"257bed8a-876b-4f5e-8a4c-66c1e47b33dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.017108 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62xs6\" (UniqueName: \"kubernetes.io/projected/ebdc7a41-2398-46bd-9724-aca23394d4b3-kube-api-access-62xs6\") pod \"kube-storage-version-migrator-operator-b67b599dd-wm5j7\" (UID: \"ebdc7a41-2398-46bd-9724-aca23394d4b3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.020091 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.027438 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.029996 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-65dsm"] Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.034328 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpxvj\" (UniqueName: \"kubernetes.io/projected/fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b-kube-api-access-gpxvj\") pod \"etcd-operator-b45778765-rmxxc\" (UID: \"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:24 crc kubenswrapper[4771]: W0227 01:08:24.040829 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8009a0_8b08_421c_8f35_e3127b0b5e8e.slice/crio-dec5b9c457df15717739641c79145cc35de5bfc378494663a11ec140f8b6980e WatchSource:0}: Error finding container dec5b9c457df15717739641c79145cc35de5bfc378494663a11ec140f8b6980e: Status 404 returned error can't find the container with id dec5b9c457df15717739641c79145cc35de5bfc378494663a11ec140f8b6980e Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.055968 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf82l\" (UniqueName: \"kubernetes.io/projected/6be7b2b4-9297-4d34-8ebc-72e57afda4e4-kube-api-access-zf82l\") pod \"openshift-controller-manager-operator-756b6f6bc6-tsf86\" (UID: \"6be7b2b4-9297-4d34-8ebc-72e57afda4e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.076217 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-264dv\" (UniqueName: \"kubernetes.io/projected/dc136422-ec9e-411c-9c1c-5704e6033226-kube-api-access-264dv\") pod 
\"openshift-config-operator-7777fb866f-5mtwr\" (UID: \"dc136422-ec9e-411c-9c1c-5704e6033226\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.101939 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrdrg\" (UniqueName: \"kubernetes.io/projected/62c59a17-8b65-4876-a007-1cb1f45a7c2b-kube-api-access-jrdrg\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qwgc\" (UID: \"62c59a17-8b65-4876-a007-1cb1f45a7c2b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.117340 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9qk2\" (UniqueName: \"kubernetes.io/projected/f93aff21-0c7f-43b8-a1da-2c35dbfd8831-kube-api-access-t9qk2\") pod \"machine-config-operator-74547568cd-qr2pd\" (UID: \"f93aff21-0c7f-43b8-a1da-2c35dbfd8831\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.120797 4771 projected.go:288] Couldn't get configMap openshift-authentication-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.140712 4771 projected.go:288] Couldn't get configMap openshift-oauth-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.160726 4771 projected.go:288] Couldn't get configMap openshift-cluster-machine-approver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.161126 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/581a9e83-a359-4b05-b9c0-0d4c8d39277b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-28znj\" (UID: \"581a9e83-a359-4b05-b9c0-0d4c8d39277b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.162444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssnz8\" (UniqueName: \"kubernetes.io/projected/7f792bed-0aa4-455f-8fb7-2b26d76a6172-kube-api-access-ssnz8\") pod \"openshift-apiserver-operator-796bbdcf4f-pjdh5\" (UID: \"7f792bed-0aa4-455f-8fb7-2b26d76a6172\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.177522 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvlhp\" (UniqueName: \"kubernetes.io/projected/423b8446-879f-47e3-9779-14373f259598-kube-api-access-vvlhp\") pod \"console-operator-58897d9998-z6ptp\" (UID: \"423b8446-879f-47e3-9779-14373f259598\") " pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.186483 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.188267 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-b77x6"] Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.191728 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkvq9\" (UniqueName: \"kubernetes.io/projected/d5d1adad-cc9f-4d57-8099-d8e3323da190-kube-api-access-pkvq9\") pod \"machine-config-controller-84d6567774-ltdq5\" (UID: \"d5d1adad-cc9f-4d57-8099-d8e3323da190\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.193122 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.201944 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.206107 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.213065 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.214579 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72432eea-a601-4d93-8aee-41ff9573ff0a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-m6vbm\" (UID: \"72432eea-a601-4d93-8aee-41ff9573ff0a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.221962 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.256663 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.260248 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.269045 4771 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.269108 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config podName:dedd2c80-3f88-4871-82b4-7744b17d00fc nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.269090886 +0000 UTC m=+218.206652174 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config") pod "machine-approver-56656f9798-s7265" (UID: "dedd2c80-3f88-4871-82b4-7744b17d00fc") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.270294 4771 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.270329 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.270319939 +0000 UTC m=+218.207881227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.270769 4771 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.270799 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.270789601 +0000 UTC m=+218.208350889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.270821 4771 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.270840 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config podName:dedd2c80-3f88-4871-82b4-7744b17d00fc nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.270834522 +0000 UTC m=+218.208395800 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config") pod "machine-approver-56656f9798-s7265" (UID: "dedd2c80-3f88-4871-82b4-7744b17d00fc") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271445 4771 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271493 4771 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271527 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271518711 +0000 UTC m=+218.209079999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271526 4771 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271574 4771 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271585 4771 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271599 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271632 4771 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271645 4771 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271673 4771 configmap.go:193] Couldn't get configMap openshift-authentication-operator/authentication-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271681 4771 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271693 4771 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271706 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync 
configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271712 4771 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271653 4771 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271737 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271738 4771 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271790 4771 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271799 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271665 4771 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271665 4771 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271826 4771 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271836 4771 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271544 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert podName:179b172a-a753-4f11-9532-63816979538a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271534821 +0000 UTC m=+218.209096109 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert") pod "route-controller-manager-6576b87f9c-g6lg5" (UID: "179b172a-a753-4f11-9532-63816979538a") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271855 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config podName:179b172a-a753-4f11-9532-63816979538a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271848449 +0000 UTC m=+218.209409737 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config") pod "route-controller-manager-6576b87f9c-g6lg5" (UID: "179b172a-a753-4f11-9532-63816979538a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271857 4771 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271859 4771 secret.go:188] Couldn't get secret openshift-authentication-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271881 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.27186154 +0000 UTC m=+218.209422938 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271887 4771 secret.go:188] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271903 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271894931 +0000 UTC m=+218.209456369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271921 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271913131 +0000 UTC m=+218.209474539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.271939 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271944 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271936962 +0000 UTC m=+218.209498350 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271965 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271956142 +0000 UTC m=+218.209517570 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271980 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271972513 +0000 UTC m=+218.209533921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.271995 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls podName:dedd2c80-3f88-4871-82b4-7744b17d00fc nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.271988223 +0000 UTC m=+218.209549641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls") pod "machine-approver-56656f9798-s7265" (UID: "dedd2c80-3f88-4871-82b4-7744b17d00fc") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272009 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272002493 +0000 UTC m=+218.209563911 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272022 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272016414 +0000 UTC m=+218.209577842 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272035 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272028654 +0000 UTC m=+218.209590062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272050 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272042764 +0000 UTC m=+218.209604172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272065 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272057715 +0000 UTC m=+218.209619143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272078 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272072055 +0000 UTC m=+218.209633473 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272092 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272085696 +0000 UTC m=+218.209647114 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272105 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272098536 +0000 UTC m=+218.209659944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272119 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272112736 +0000 UTC m=+218.209674134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272134 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272127747 +0000 UTC m=+218.209689145 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272148 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272142047 +0000 UTC m=+218.209703445 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272161 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272154867 +0000 UTC m=+218.209716285 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272176 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca podName:179b172a-a753-4f11-9532-63816979538a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272168568 +0000 UTC m=+218.209729966 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca") pod "route-controller-manager-6576b87f9c-g6lg5" (UID: "179b172a-a753-4f11-9532-63816979538a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272191 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls podName:ca03f0a2-fdee-42d5-a671-212f7b35b6aa nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272184578 +0000 UTC m=+218.209745996 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls") pod "cluster-samples-operator-665b6dd947-cqlnh" (UID: "ca03f0a2-fdee-42d5-a671-212f7b35b6aa") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.272206 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.272199138 +0000 UTC m=+218.209760556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync secret cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.272433 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc"] Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.274850 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9n4h\" (UniqueName: \"kubernetes.io/projected/fd53cc9e-5423-4ad7-afe5-54824c08341e-kube-api-access-v9n4h\") pod \"dns-default-crqhd\" (UID: \"fd53cc9e-5423-4ad7-afe5-54824c08341e\") " pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.292971 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.292958 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr2k6\" (UniqueName: \"kubernetes.io/projected/963fd070-b5e6-4a67-afd6-d056aacf8bc2-kube-api-access-sr2k6\") pod \"marketplace-operator-79b997595-rwr4f\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.300067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfm6g\" (UniqueName: \"kubernetes.io/projected/6c8a25c9-89d7-4606-8f6e-fe1b46b061eb-kube-api-access-gfm6g\") pod \"packageserver-d55dfcdfc-tdq29\" (UID: \"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.306019 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.312764 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.313097 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72"] Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.314538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk5ks\" (UniqueName: \"kubernetes.io/projected/ee145931-4993-4e80-88c9-1f8a4f46e77c-kube-api-access-tk5ks\") pod \"service-ca-operator-777779d784-nskbr\" (UID: \"ee145931-4993-4e80-88c9-1f8a4f46e77c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.331571 4771 projected.go:288] Couldn't get configMap openshift-route-controller-manager/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.331604 4771 projected.go:194] Error preparing data for projected volume kube-api-access-xzzvb for pod openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.331696 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb podName:179b172a-a753-4f11-9532-63816979538a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:24.831674568 +0000 UTC m=+217.769235856 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xzzvb" (UniqueName: "kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb") pod "route-controller-manager-6576b87f9c-g6lg5" (UID: "179b172a-a753-4f11-9532-63816979538a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: W0227 01:08:24.332585 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6de81df_af0d_4ebe_b254_7a45c4eb5312.slice/crio-d4f6caf413b15a52ff7c90f6068a74ecff0a42c9f3362a5a5ff79c68acf00271 WatchSource:0}: Error finding container d4f6caf413b15a52ff7c90f6068a74ecff0a42c9f3362a5a5ff79c68acf00271: Status 404 returned error can't find the container with id d4f6caf413b15a52ff7c90f6068a74ecff0a42c9f3362a5a5ff79c68acf00271 Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.349008 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4r9g\" (UniqueName: \"kubernetes.io/projected/602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb-kube-api-access-h4r9g\") pod \"package-server-manager-789f6589d5-tct95\" (UID: \"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.367498 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqttp\" (UniqueName: \"kubernetes.io/projected/83ee3033-e504-40f2-9c72-70d863d0d333-kube-api-access-tqttp\") pod \"machine-config-server-77b5k\" (UID: \"83ee3033-e504-40f2-9c72-70d863d0d333\") " pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.370221 4771 projected.go:288] Couldn't get configMap openshift-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.370328 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.377947 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zlpg\" (UniqueName: \"kubernetes.io/projected/d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb-kube-api-access-8zlpg\") pod \"ingress-canary-wf8jh\" (UID: \"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb\") " pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:24 crc kubenswrapper[4771]: W0227 01:08:24.381291 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706b5440_ad63_4d92_9708_96ce6d6926b8.slice/crio-e80e4e45df14a64b4c91bb630c55e7e79f8850ebec2140fb65056434225b47a9 WatchSource:0}: Error finding container e80e4e45df14a64b4c91bb630c55e7e79f8850ebec2140fb65056434225b47a9: Status 404 returned error can't find the container with id e80e4e45df14a64b4c91bb630c55e7e79f8850ebec2140fb65056434225b47a9 Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.411266 4771 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.417324 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.420446 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.421053 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdbfg\" (UniqueName: \"kubernetes.io/projected/c2be6688-5fef-4657-9eea-235fc8bb13f7-kube-api-access-kdbfg\") pod \"catalog-operator-68c6474976-khn28\" (UID: \"c2be6688-5fef-4657-9eea-235fc8bb13f7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.434702 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr4jg\" (UniqueName: \"kubernetes.io/projected/50a07abb-e77f-450d-990f-3c9e3b0360d9-kube-api-access-pr4jg\") pod \"service-ca-9c57cc56f-rmkfg\" (UID: \"50a07abb-e77f-450d-990f-3c9e3b0360d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.440739 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.441663 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttp4j\" (UniqueName: \"kubernetes.io/projected/e0d5634e-ce3f-40a5-b85d-64f8c4708c59-kube-api-access-ttp4j\") pod \"auto-csr-approver-29535908-hhvn5\" (UID: \"e0d5634e-ce3f-40a5-b85d-64f8c4708c59\") " pod="openshift-infra/auto-csr-approver-29535908-hhvn5" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.448496 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc"] Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.449788 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.458190 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.460093 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wf8jh" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.467768 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-77b5k" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.487467 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj7ld\" (UniqueName: \"kubernetes.io/projected/a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17-kube-api-access-sj7ld\") pod \"olm-operator-6b444d44fb-4ltbb\" (UID: \"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.510465 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgv82\" (UniqueName: \"kubernetes.io/projected/b9b84607-b33c-4c44-8331-9e09df2cccfe-kube-api-access-wgv82\") pod \"multus-admission-controller-857f4d67dd-98pdr\" (UID: \"b9b84607-b33c-4c44-8331-9e09df2cccfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" Feb 27 01:08:24 crc kubenswrapper[4771]: W0227 01:08:24.510658 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c59a17_8b65_4876_a007_1cb1f45a7c2b.slice/crio-c61fb600262047d950687ae0588e3d15ba931af9edec2c30fc9f9e148471119e WatchSource:0}: Error finding container c61fb600262047d950687ae0588e3d15ba931af9edec2c30fc9f9e148471119e: Status 404 returned error can't find the container with id c61fb600262047d950687ae0588e3d15ba931af9edec2c30fc9f9e148471119e Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.515022 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45ck\" (UniqueName: \"kubernetes.io/projected/740f2438-5f9c-40bb-ae51-77aac4708ab9-kube-api-access-s45ck\") pod \"router-default-5444994796-h4dqr\" (UID: \"740f2438-5f9c-40bb-ae51-77aac4708ab9\") " pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.539808 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnlw6\" (UniqueName: \"kubernetes.io/projected/af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f-kube-api-access-vnlw6\") pod \"ingress-operator-5b745b69d9-bdlp9\" (UID: \"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.548429 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.561206 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.579202 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.599724 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.615826 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.616251 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.616358 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.616893 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:10:26.616863416 +0000 UTC m=+339.554424704 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.618148 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.622492 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.638131 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.653541 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.657641 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.660685 4771 projected.go:194] Error preparing data for projected volume kube-api-access-m8x9z for pod openshift-apiserver/apiserver-76f77b778f-76f5q: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.660738 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/f4eaf94a-ef2d-48bb-8762-bad950a6918a-kube-api-access-m8x9z podName:f4eaf94a-ef2d-48bb-8762-bad950a6918a nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.160722298 +0000 UTC m=+218.098283586 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-m8x9z" (UniqueName: "kubernetes.io/projected/f4eaf94a-ef2d-48bb-8762-bad950a6918a-kube-api-access-m8x9z") pod "apiserver-76f77b778f-76f5q" (UID: "f4eaf94a-ef2d-48bb-8762-bad950a6918a") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.661022 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.677055 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.677705 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.684257 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.692296 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.698598 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.710382 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.717508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.717684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.727653 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.728201 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.732441 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj"] Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.732486 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk"] Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.732593 4771 projected.go:194] Error preparing data for projected volume kube-api-access-mhdh9 for pod openshift-cluster-machine-approver/machine-approver-56656f9798-s7265: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.732655 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dedd2c80-3f88-4871-82b4-7744b17d00fc-kube-api-access-mhdh9 podName:dedd2c80-3f88-4871-82b4-7744b17d00fc nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.232637939 +0000 UTC m=+218.170199227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mhdh9" (UniqueName: "kubernetes.io/projected/dedd2c80-3f88-4871-82b4-7744b17d00fc-kube-api-access-mhdh9") pod "machine-approver-56656f9798-s7265" (UID: "dedd2c80-3f88-4871-82b4-7744b17d00fc") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.734208 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.736149 4771 request.go:700] Waited for 2.233048217s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.736377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.738074 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.740590 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.747862 4771 projected.go:194] Error preparing data for projected volume kube-api-access-lvjg4 for pod openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.748005 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-kube-api-access-lvjg4 podName:ca03f0a2-fdee-42d5-a671-212f7b35b6aa nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.247983409 +0000 UTC m=+218.185544697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lvjg4" (UniqueName: "kubernetes.io/projected/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-kube-api-access-lvjg4") pod "cluster-samples-operator-665b6dd947-cqlnh" (UID: "ca03f0a2-fdee-42d5-a671-212f7b35b6aa") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.757266 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.757543 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5"] Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.762988 4771 projected.go:194] Error preparing data for projected volume kube-api-access-b4fsd for pod openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: E0227 01:08:24.763070 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e58f1a0-a75d-4280-8cfc-c249696d0b38-kube-api-access-b4fsd podName:2e58f1a0-a75d-4280-8cfc-c249696d0b38 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.263049612 +0000 UTC m=+218.200610900 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-b4fsd" (UniqueName: "kubernetes.io/projected/2e58f1a0-a75d-4280-8cfc-c249696d0b38-kube-api-access-b4fsd") pod "apiserver-7bbb656c7d-sf4rl" (UID: "2e58f1a0-a75d-4280-8cfc-c249696d0b38") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.781376 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.796966 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.797787 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.806736 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.812519 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.819089 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.836347 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" event={"ID":"7bd5b18f-fa8c-46d4-a571-630a67b14023","Type":"ContainerStarted","Data":"5dd52a424da15613a3ba943cce82014290c89c0fef40ae023bdf64f87d231a90"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.836394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" event={"ID":"7bd5b18f-fa8c-46d4-a571-630a67b14023","Type":"ContainerStarted","Data":"7ebb84798c01594e52426d92da044beb78edf03f75164cd76283a67cf44f766e"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.836405 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" event={"ID":"7bd5b18f-fa8c-46d4-a571-630a67b14023","Type":"ContainerStarted","Data":"a92bc83d75fc66a1df2b883c1d93c6d331c8985d5eb1e6362aaadb124388da9a"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.838034 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65dsm" event={"ID":"db8009a0-8b08-421c-8f35-e3127b0b5e8e","Type":"ContainerStarted","Data":"1a941b4ceb914b38949c9af872a4e5af7293c8636bd1a1d01ffd2ad49def31fe"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.838067 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65dsm" event={"ID":"db8009a0-8b08-421c-8f35-e3127b0b5e8e","Type":"ContainerStarted","Data":"dec5b9c457df15717739641c79145cc35de5bfc378494663a11ec140f8b6980e"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.838715 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.840772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" event={"ID":"5b3ec1be-ffa3-4733-ac99-7c86693297d7","Type":"ContainerStarted","Data":"575d25bc0b273bcc043417aaf22dcfcb22e01190cdec486becb8cb765090c021"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.846484 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" event={"ID":"706b5440-ad63-4d92-9708-96ce6d6926b8","Type":"ContainerStarted","Data":"b755dc39b5596d5548f0b6ef376cf36ffa0f959499f774dad6ce55e5a634b24b"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.846604 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" event={"ID":"706b5440-ad63-4d92-9708-96ce6d6926b8","Type":"ContainerStarted","Data":"e80e4e45df14a64b4c91bb630c55e7e79f8850ebec2140fb65056434225b47a9"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.855809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-77b5k" event={"ID":"83ee3033-e504-40f2-9c72-70d863d0d333","Type":"ContainerStarted","Data":"9ce15690ed05bbf098fcee82e1ef02f7009dd0e7300554887e540371e762faa4"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.857140 4771 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.857560 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" event={"ID":"62c59a17-8b65-4876-a007-1cb1f45a7c2b","Type":"ContainerStarted","Data":"c61fb600262047d950687ae0588e3d15ba931af9edec2c30fc9f9e148471119e"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.879896 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.880267 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" event={"ID":"e49e35fa-4abf-4adb-8e2d-48f71bd28c18","Type":"ContainerStarted","Data":"0cdc8141be8af0b7101511abaec6dfc5c12e17518444622314f4fe6f826b7822"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.880310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" event={"ID":"e49e35fa-4abf-4adb-8e2d-48f71bd28c18","Type":"ContainerStarted","Data":"d9bc37b701e839eae4b36f25dedb0acda72c7205714cd3adafd50687aee3e0ed"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.899036 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.915314 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" event={"ID":"b6de81df-af0d-4ebe-b254-7a45c4eb5312","Type":"ContainerStarted","Data":"e0e334c29109c38d41bb92b4427f3ba3625e86f2c9191671d44c6e07d0b9487f"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.915374 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" event={"ID":"b6de81df-af0d-4ebe-b254-7a45c4eb5312","Type":"ContainerStarted","Data":"d4f6caf413b15a52ff7c90f6068a74ecff0a42c9f3362a5a5ff79c68acf00271"} Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.920290 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzzvb\" (UniqueName: \"kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.924233 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.925675 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-gd7gl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.925713 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gd7gl" podUID="58839f3c-374c-43d0-ac2e-32c497ead461" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.938224 
4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.957657 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.958449 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.980248 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 27 01:08:24 crc kubenswrapper[4771]: I0227 01:08:24.987937 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.010354 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.018004 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.021910 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.038726 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.057696 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.077952 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.100389 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.117991 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.126450 4771 projected.go:288] Couldn't get configMap openshift-authentication-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.126484 4771 projected.go:194] Error preparing data for projected volume kube-api-access-9pwmh for pod openshift-authentication-operator/authentication-operator-69f744f599-crfkk: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.126567 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/608f3fea-4388-4d6b-8795-fbba59621e28-kube-api-access-9pwmh podName:608f3fea-4388-4d6b-8795-fbba59621e28 nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.626531572 +0000 UTC m=+218.564092920 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9pwmh" (UniqueName: "kubernetes.io/projected/608f3fea-4388-4d6b-8795-fbba59621e28-kube-api-access-9pwmh") pod "authentication-operator-69f744f599-crfkk" (UID: "608f3fea-4388-4d6b-8795-fbba59621e28") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.137984 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.190081 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.194493 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.197914 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.225214 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.229994 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8x9z\" (UniqueName: \"kubernetes.io/projected/f4eaf94a-ef2d-48bb-8762-bad950a6918a-kube-api-access-m8x9z\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.234070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhdh9\" (UniqueName: \"kubernetes.io/projected/dedd2c80-3f88-4871-82b4-7744b17d00fc-kube-api-access-mhdh9\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.237844 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.255624 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhdh9\" (UniqueName: \"kubernetes.io/projected/dedd2c80-3f88-4871-82b4-7744b17d00fc-kube-api-access-mhdh9\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.259506 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.281901 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.306173 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.317544 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.322801 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzzvb\" (UniqueName: \"kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338132 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338533 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvjg4\" (UniqueName: \"kubernetes.io/projected/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-kube-api-access-lvjg4\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338589 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338605 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338626 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338662 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338677 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338698 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338720 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338736 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338753 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338773 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338792 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338811 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338831 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" 
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338868 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338909 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338925 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338941 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4fsd\" (UniqueName: \"kubernetes.io/projected/2e58f1a0-a75d-4280-8cfc-c249696d0b38-kube-api-access-b4fsd\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338969 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.338985 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.339001 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.339026 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.339045 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.339065 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.339089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.339105 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.339126 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.339140 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.346196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-service-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.348869 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.350082 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.350424 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvjg4\" (UniqueName: \"kubernetes.io/projected/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-kube-api-access-lvjg4\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.350562 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-serving-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.352690 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4fsd\" (UniqueName: \"kubernetes.io/projected/2e58f1a0-a75d-4280-8cfc-c249696d0b38-kube-api-access-b4fsd\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.353607 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.354768 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.355216 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.355322 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-audit-policies\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.355917 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.356484 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.358230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dedd2c80-3f88-4871-82b4-7744b17d00fc-auth-proxy-config\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.358312 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.360592 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.364448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-etcd-client\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.366151 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-encryption-config\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.366326 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8x9z\" (UniqueName: \"kubernetes.io/projected/f4eaf94a-ef2d-48bb-8762-bad950a6918a-kube-api-access-m8x9z\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.370392 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.375930 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-etcd-client\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.377466 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.378013 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.380049 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.380448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-audit\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.383160 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca03f0a2-fdee-42d5-a671-212f7b35b6aa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cqlnh\" (UID: \"ca03f0a2-fdee-42d5-a671-212f7b35b6aa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.391428 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-serving-cert\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.391838 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eaf94a-ef2d-48bb-8762-bad950a6918a-serving-cert\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.391925 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2e58f1a0-a75d-4280-8cfc-c249696d0b38-encryption-config\") pod \"apiserver-7bbb656c7d-sf4rl\" (UID: \"2e58f1a0-a75d-4280-8cfc-c249696d0b38\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.397348 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/608f3fea-4388-4d6b-8795-fbba59621e28-serving-cert\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.398879 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.399426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: 
\"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.401763 4771 projected.go:194] Error preparing data for projected volume kube-api-access-hph8x for pod openshift-controller-manager/controller-manager-879f6c89f-wgwwp: failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.401826 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x podName:b9f091ab-b345-4bf0-ac8e-b44181c8553f nodeName:}" failed. No retries permitted until 2026-02-27 01:08:25.901807596 +0000 UTC m=+218.839368874 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hph8x" (UniqueName: "kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x") pod "controller-manager-879f6c89f-wgwwp" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f") : failed to sync configmap cache: timed out waiting for the condition Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.404403 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dedd2c80-3f88-4871-82b4-7744b17d00fc-machine-approver-tls\") pod \"machine-approver-56656f9798-s7265\" (UID: \"dedd2c80-3f88-4871-82b4-7744b17d00fc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.420711 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.431740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/608f3fea-4388-4d6b-8795-fbba59621e28-config\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.442737 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.452671 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f4eaf94a-ef2d-48bb-8762-bad950a6918a-image-import-ca\") pod \"apiserver-76f77b778f-76f5q\" (UID: \"f4eaf94a-ef2d-48bb-8762-bad950a6918a\") " pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.453220 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.454786 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.455971 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.457259 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.463310 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca\") pod \"route-controller-manager-6576b87f9c-g6lg5\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.478039 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.478585 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95"]
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.487171 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.490897 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.497003 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-76f5q"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.543512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-certificates\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.543564 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-tls\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.543597 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmrm\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-kube-api-access-hkmrm\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.543625 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-trusted-ca\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.543670 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName:
\"kubernetes.io/secret/ebbe1c67-5385-4eda-af88-793c2c85e043-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.543704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebbe1c67-5385-4eda-af88-793c2c85e043-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.543726 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-bound-sa-token\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.543748 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.544014 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.044001866 +0000 UTC m=+218.981563154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.563815 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" Feb 27 01:08:25 crc kubenswrapper[4771]: W0227 01:08:25.568487 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602fa36f_0b9d_4ce5_abc5_b8d894cbf0fb.slice/crio-e3dab06716e8c37c334909366bd3a9503e0ee2ab01c331e90a8c83e90d75be31 WatchSource:0}: Error finding container e3dab06716e8c37c334909366bd3a9503e0ee2ab01c331e90a8c83e90d75be31: Status 404 returned error can't find the container with id e3dab06716e8c37c334909366bd3a9503e0ee2ab01c331e90a8c83e90d75be31 Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.573515 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.597441 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.647291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.647475 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.147452088 +0000 UTC m=+219.085013376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.647943 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-trusted-ca\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.648066 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pwmh\" (UniqueName: \"kubernetes.io/projected/608f3fea-4388-4d6b-8795-fbba59621e28-kube-api-access-9pwmh\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.648381 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebbe1c67-5385-4eda-af88-793c2c85e043-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.651292 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebbe1c67-5385-4eda-af88-793c2c85e043-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.651341 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-csi-data-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.651450 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-bound-sa-token\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.651541 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.651726 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-plugins-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.651751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm48w\" (UniqueName: \"kubernetes.io/projected/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-kube-api-access-cm48w\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.654060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-registration-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.654372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-certificates\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.654448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-tls\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.654532 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-socket-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.654645 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-mountpoint-dir\") pod \"csi-hostpathplugin-mwrgv\" 
(UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.654963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmrm\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-kube-api-access-hkmrm\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.655491 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-trusted-ca\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.663929 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.163872047 +0000 UTC m=+219.101433335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.689439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebbe1c67-5385-4eda-af88-793c2c85e043-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.704584 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-certificates\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.713986 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebbe1c67-5385-4eda-af88-793c2c85e043-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.714051 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pwmh\" (UniqueName: \"kubernetes.io/projected/608f3fea-4388-4d6b-8795-fbba59621e28-kube-api-access-9pwmh\") pod \"authentication-operator-69f744f599-crfkk\" (UID: \"608f3fea-4388-4d6b-8795-fbba59621e28\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.714491 4771 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-tls\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.734083 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-bound-sa-token\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.751307 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmrm\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-kube-api-access-hkmrm\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.759115 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-csi-data-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-plugins-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760057 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm48w\" (UniqueName: \"kubernetes.io/projected/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-kube-api-access-cm48w\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-registration-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-socket-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760130 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-mountpoint-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760221 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-mountpoint-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.760285 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.260271033 +0000 UTC m=+219.197832321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760352 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-csi-data-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-plugins-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760686 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-registration-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.760705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-socket-dir\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.798414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm48w\" (UniqueName: \"kubernetes.io/projected/e14c3fd8-5a78-42cc-b829-7a5b4ce8d996-kube-api-access-cm48w\") pod \"csi-hostpathplugin-mwrgv\" (UID: \"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996\") " pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.816592 4771 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-dns/dns-default-crqhd"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.816618 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wf8jh"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.830357 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29"] Feb 27 01:08:25 crc kubenswrapper[4771]: W0227 01:08:25.830952 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd53cc9e_5423_4ad7_afe5_54824c08341e.slice/crio-a30fa02c2646f269e633b744222d15f80e86dbd77e40a48810e9d96543f75ca5 WatchSource:0}: Error finding container a30fa02c2646f269e633b744222d15f80e86dbd77e40a48810e9d96543f75ca5: Status 404 returned error can't find the container with id a30fa02c2646f269e633b744222d15f80e86dbd77e40a48810e9d96543f75ca5 Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.831103 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.862670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.863001 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.362989867 +0000 UTC m=+219.300551155 (durationBeforeRetry 500ms). 
Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.863001 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.362989867 +0000 UTC m=+219.300551155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.863559 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm"]
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.870128 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nskbr"]
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.873926 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rmxxc"]
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.873972 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-z6ptp"]
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.963115 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.963650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hph8x\" (UniqueName: \"kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp"
Feb 27 01:08:25 crc kubenswrapper[4771]: E0227 01:08:25.964121 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.464094538 +0000 UTC m=+219.401655816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.969798 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.973098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hph8x\" (UniqueName: \"kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x\") pod \"controller-manager-879f6c89f-wgwwp\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.973755 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rmkfg"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.974171 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-crqhd" event={"ID":"fd53cc9e-5423-4ad7-afe5-54824c08341e","Type":"ContainerStarted","Data":"a30fa02c2646f269e633b744222d15f80e86dbd77e40a48810e9d96543f75ca5"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.975363 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98pdr"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.977493 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" event={"ID":"7f792bed-0aa4-455f-8fb7-2b26d76a6172","Type":"ContainerStarted","Data":"40b0b7fc63aa1105e123b423289f3072a3447ca035bfce59efad0dcacb3b68b9"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.977563 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" event={"ID":"7f792bed-0aa4-455f-8fb7-2b26d76a6172","Type":"ContainerStarted","Data":"903243c70038a50946a7be3d7a060f8d5b97fc6febee0600ff5d2350fc1a59e4"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.984206 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" event={"ID":"e49e35fa-4abf-4adb-8e2d-48f71bd28c18","Type":"ContainerStarted","Data":"ccf7c37b5e7441c7c1a1e720cce9f12b034bbffb2ff9fd2ddcf38362c006a794"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.985998 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwr4f"] Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.987374 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" event={"ID":"f93aff21-0c7f-43b8-a1da-2c35dbfd8831","Type":"ContainerStarted","Data":"74a4f7ce48f4d074a03dd90ed4ed7ba6b12bee9ce906587282f73ee3ffd28cf2"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.988639 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wf8jh" event={"ID":"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb","Type":"ContainerStarted","Data":"760560f99cdc2bc5b7878a32a04eb7e6c4ad135cf6389ee88b3e5bafc25fb6db"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.989520 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" event={"ID":"257bed8a-876b-4f5e-8a4c-66c1e47b33dc","Type":"ContainerStarted","Data":"ac5b89a96f4a2e851093880a64589165c9292d64a0e95de2b1af92570bdaab37"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.990155 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" event={"ID":"dedd2c80-3f88-4871-82b4-7744b17d00fc","Type":"ContainerStarted","Data":"c16518ee834b107f268772b4e0a30e5f486e185752ec06704bdce4b4fd4e8b8d"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.990801 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" event={"ID":"6be7b2b4-9297-4d34-8ebc-72e57afda4e4","Type":"ContainerStarted","Data":"e58082a7e3d49f9fcbb0785205a90c2cd7e75f0eec557e7b84800e8abc0f3a42"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.991684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" event={"ID":"ebdc7a41-2398-46bd-9724-aca23394d4b3","Type":"ContainerStarted","Data":"79bf6db525bd3480c30fd814d1bfe8baf8cff5356cb18eb024ed2fc66dec3372"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.992970 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" event={"ID":"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb","Type":"ContainerStarted","Data":"e3dab06716e8c37c334909366bd3a9503e0ee2ab01c331e90a8c83e90d75be31"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.993819 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h4dqr" event={"ID":"740f2438-5f9c-40bb-ae51-77aac4708ab9","Type":"ContainerStarted","Data":"e359491a25d5a6bbda4981ba1d8da6e7837e97bdefce3b4ffb5fb86d2c64f3a8"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.993844 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h4dqr" event={"ID":"740f2438-5f9c-40bb-ae51-77aac4708ab9","Type":"ContainerStarted","Data":"5357bfcdb75d8166f0303a291f43dc6c4c17644b28756dc8962dc6da026b56b5"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.994807 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" event={"ID":"581a9e83-a359-4b05-b9c0-0d4c8d39277b","Type":"ContainerStarted","Data":"c91ad09d402a6d95473160741ec52ac045e18e16bca20a017d0ff41559d96159"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.995638 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" event={"ID":"62c59a17-8b65-4876-a007-1cb1f45a7c2b","Type":"ContainerStarted","Data":"dc96a16cd37b794916c2e7136f3c7f50ae0a7b38ca929a5b67fca562a87e44cd"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.996654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-77b5k" event={"ID":"83ee3033-e504-40f2-9c72-70d863d0d333","Type":"ContainerStarted","Data":"bd2792b98d720a4c807cacc460e811ebb34cc6de902cb6f3538eb879f1eae225"} Feb 27 01:08:25 crc kubenswrapper[4771]: I0227 01:08:25.998147 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" event={"ID":"d5d1adad-cc9f-4d57-8099-d8e3323da190","Type":"ContainerStarted","Data":"2339609429d8926577092bed5bb1d2882f929438759c612e2dfc7d20ef40a2b3"} Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.001095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" event={"ID":"dc136422-ec9e-411c-9c1c-5704e6033226","Type":"ContainerStarted","Data":"c95ac85b32b578b8024b290b0e999f4914a2d087ae24e4c30d13bec66829a94d"} Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.002569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" event={"ID":"706b5440-ad63-4d92-9708-96ce6d6926b8","Type":"ContainerStarted","Data":"0c2bca50a07e14da144a42f789098f9cefb9de6e2dbd104fb0332371bf8bba8f"} Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.019100 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" event={"ID":"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb","Type":"ContainerStarted","Data":"e032fdfb3dcce590a0880a19db889139164d8f34d7abd2d997f5fe01eee68b3e"} Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.043914 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9"] Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.045040 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-hhvn5"] Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.045772 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28"] Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.064952 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.069487 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.569473123 +0000 UTC m=+219.507034411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.111375 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.176470 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.176786 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.676771339 +0000 UTC m=+219.614332627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.181484 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gd7gl" podStartSLOduration=173.181463785 podStartE2EDuration="2m53.181463785s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.141751714 +0000 UTC m=+219.079313002" watchObservedRunningTime="2026-02-27 01:08:26.181463785 +0000 UTC m=+219.119025083" Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.187571 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.212259 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb"] Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.235910 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-h4dqr" podStartSLOduration=173.23589315 podStartE2EDuration="2m53.23589315s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.235420846 +0000 UTC m=+219.172982134" watchObservedRunningTime="2026-02-27 01:08:26.23589315 +0000 UTC m=+219.173454438" Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.251829 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl"] Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.277806 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.278112 4771 
Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.278112 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.778101877 +0000 UTC m=+219.715663165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.330726 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh"]
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.331319 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-76f5q"]
Feb 27 01:08:26 crc kubenswrapper[4771]: W0227 01:08:26.350894 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e58f1a0_a75d_4280_8cfc_c249696d0b38.slice/crio-e5425a9c619195948bbc6fe29556e5a4632da6a89fb3ae9359e57b181fe55a91 WatchSource:0}: Error finding container e5425a9c619195948bbc6fe29556e5a4632da6a89fb3ae9359e57b181fe55a91: Status 404 returned error can't find the container with id e5425a9c619195948bbc6fe29556e5a4632da6a89fb3ae9359e57b181fe55a91
Feb 27 01:08:26 crc kubenswrapper[4771]: W0227 01:08:26.353265 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c2f59db7ef1f7c4dca77787a37dc857bf9a39bd3d6017f01ebce948fad6c7734 WatchSource:0}: Error finding container c2f59db7ef1f7c4dca77787a37dc857bf9a39bd3d6017f01ebce948fad6c7734: Status 404 returned error can't find the container with id c2f59db7ef1f7c4dca77787a37dc857bf9a39bd3d6017f01ebce948fad6c7734
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.382697 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.383056 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.88303331 +0000 UTC m=+219.820594608 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.384034 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.384523 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.884492749 +0000 UTC m=+219.822054037 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.472166 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"] Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.482184 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mwrgv"] Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.485479 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.485826 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:26.985795185 +0000 UTC m=+219.923356473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.587260 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jwx72" podStartSLOduration=172.587242526 podStartE2EDuration="2m52.587242526s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.586969138 +0000 UTC m=+219.524530426" watchObservedRunningTime="2026-02-27 01:08:26.587242526 +0000 UTC m=+219.524803814"
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.588172 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjdh5" podStartSLOduration=173.58816731 podStartE2EDuration="2m53.58816731s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.550904475 +0000 UTC m=+219.488465763" watchObservedRunningTime="2026-02-27 01:08:26.58816731 +0000 UTC m=+219.525728598"
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.592914 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.593211 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.093198495 +0000 UTC m=+220.030759783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.597503 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-crfkk"]
Feb 27 01:08:26 crc kubenswrapper[4771]: W0227 01:08:26.613242 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode14c3fd8_5a78_42cc_b829_7a5b4ce8d996.slice/crio-e080988478fdabc414dc6ecbeb972adefb559d6202353f4c83abc0fed54caf50 WatchSource:0}: Error finding container e080988478fdabc414dc6ecbeb972adefb559d6202353f4c83abc0fed54caf50: Status 404 returned error can't find the container with id e080988478fdabc414dc6ecbeb972adefb559d6202353f4c83abc0fed54caf50
Feb 27 01:08:26 crc kubenswrapper[4771]: W0227 01:08:26.634190 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod179b172a_a753_4f11_9532_63816979538a.slice/crio-8713e36d9d36df5b6d95754def13484a85337456f5a8da7ee35edc9cbffcdf0f WatchSource:0}: Error finding container 8713e36d9d36df5b6d95754def13484a85337456f5a8da7ee35edc9cbffcdf0f: Status 404 returned error can't find the container with id 8713e36d9d36df5b6d95754def13484a85337456f5a8da7ee35edc9cbffcdf0f
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.693967 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.694323 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.194307796 +0000 UTC m=+220.131869084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.729630 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-h4dqr"
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.737136 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 27 01:08:26 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Feb 27 01:08:26 crc kubenswrapper[4771]: [+]process-running ok
Feb 27 01:08:26 crc kubenswrapper[4771]: healthz check failed
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.737182 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.754032 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-65dsm" podStartSLOduration=173.754010061 podStartE2EDuration="2m53.754010061s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.752289625 +0000 UTC m=+219.689850923" watchObservedRunningTime="2026-02-27 01:08:26.754010061 +0000 UTC m=+219.691571349"
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.816714 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.817308 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.317267611 +0000 UTC m=+220.254828899 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.829651 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-b77x6" podStartSLOduration=173.829599721 podStartE2EDuration="2m53.829599721s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.798562631 +0000 UTC m=+219.736123919" watchObservedRunningTime="2026-02-27 01:08:26.829599721 +0000 UTC m=+219.767161009"
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.836598 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wgwwp"]
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.867470 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" podStartSLOduration=173.867449692 podStartE2EDuration="2m53.867449692s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.864885683 +0000 UTC m=+219.802446971" watchObservedRunningTime="2026-02-27 01:08:26.867449692 +0000 UTC m=+219.805010980"
Feb 27 01:08:26 crc kubenswrapper[4771]: W0227 01:08:26.909151 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9f091ab_b345_4bf0_ac8e_b44181c8553f.slice/crio-2367b6bb95a8cf9644de85afe36c77e92d4a00ad69bf30cf9a79dbb18f464e56 WatchSource:0}: Error finding container 2367b6bb95a8cf9644de85afe36c77e92d4a00ad69bf30cf9a79dbb18f464e56: Status 404 returned error can't find the container with id 2367b6bb95a8cf9644de85afe36c77e92d4a00ad69bf30cf9a79dbb18f464e56
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.919879 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:26 crc kubenswrapper[4771]: E0227 01:08:26.920168 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.420151299 +0000 UTC m=+220.357712577 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.988240 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-77b5k" podStartSLOduration=5.988217158 podStartE2EDuration="5.988217158s" podCreationTimestamp="2026-02-27 01:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.948247089 +0000 UTC m=+219.885808377" watchObservedRunningTime="2026-02-27 01:08:26.988217158 +0000 UTC m=+219.925778446"
Feb 27 01:08:26 crc kubenswrapper[4771]: I0227 01:08:26.991866 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdp46" podStartSLOduration=173.991843915 podStartE2EDuration="2m53.991843915s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:26.988890575 +0000 UTC m=+219.926451863" watchObservedRunningTime="2026-02-27 01:08:26.991843915 +0000 UTC m=+219.929405203"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.021126 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.021654 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.521539478 +0000 UTC m=+220.459100766 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.048983 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" event={"ID":"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17","Type":"ContainerStarted","Data":"51f870762f7da7aee2cdb0106740861e9a487454d104f85c3d3958d784348ad8"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.068722 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-z6ptp" event={"ID":"423b8446-879f-47e3-9779-14373f259598","Type":"ContainerStarted","Data":"a94ed905b834c92d925acbb4f4c50b71611c8ac4ac5f6bcc269fdcf9216429a8"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.068763 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-z6ptp" event={"ID":"423b8446-879f-47e3-9779-14373f259598","Type":"ContainerStarted","Data":"325ded51da6aad1a8268fd1f6f82603f5d1830cd6a76d6c07473e0621db80dc7"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.069302 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-z6ptp"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.075950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wf8jh" event={"ID":"d58c6fa9-b6f3-4c21-bc67-a36e569fbfeb","Type":"ContainerStarted","Data":"f35049b44bf2416b28a87b4a125f54f6a833c7b7cbea8af1d9dfc84c7fdc3327"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.077407 4771 patch_prober.go:28] interesting pod/console-operator-58897d9998-z6ptp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.077493 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-z6ptp" podUID="423b8446-879f-47e3-9779-14373f259598" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.086753 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" event={"ID":"b9b84607-b33c-4c44-8331-9e09df2cccfe","Type":"ContainerStarted","Data":"2910743e3bc030cca719b0600478112eba9e76b2033a8322a40db7f3b7b26f1f"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.091142 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" event={"ID":"b9f091ab-b345-4bf0-ac8e-b44181c8553f","Type":"ContainerStarted","Data":"2367b6bb95a8cf9644de85afe36c77e92d4a00ad69bf30cf9a79dbb18f464e56"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.092472 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" event={"ID":"c2be6688-5fef-4657-9eea-235fc8bb13f7","Type":"ContainerStarted","Data":"27d8ea7543b48bec668c419f9678f1a00f5344d28d5345385b8d51b086f7cbc8"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.095130 4771 generic.go:334] "Generic (PLEG): container finished" podID="b6de81df-af0d-4ebe-b254-7a45c4eb5312" containerID="e0e334c29109c38d41bb92b4427f3ba3625e86f2c9191671d44c6e07d0b9487f" exitCode=0
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.095215 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" event={"ID":"b6de81df-af0d-4ebe-b254-7a45c4eb5312","Type":"ContainerDied","Data":"e0e334c29109c38d41bb92b4427f3ba3625e86f2c9191671d44c6e07d0b9487f"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.097011 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" event={"ID":"963fd070-b5e6-4a67-afd6-d056aacf8bc2","Type":"ContainerStarted","Data":"b2ed07c40ddaed774caf8731e5dc006d7f2bfbc7d2cef9b338d51ad77e145ba4"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.105034 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" event={"ID":"6c8a25c9-89d7-4606-8f6e-fe1b46b061eb","Type":"ContainerStarted","Data":"e59e973d40c23efcb905172eedfe11a01089c8fb7c83395d34f5dcc3aae1f9b6"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.107650 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.114922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" event={"ID":"2e58f1a0-a75d-4280-8cfc-c249696d0b38","Type":"ContainerStarted","Data":"e5425a9c619195948bbc6fe29556e5a4632da6a89fb3ae9359e57b181fe55a91"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.116126 4771 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tdq29 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body=
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.116174 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" podUID="6c8a25c9-89d7-4606-8f6e-fe1b46b061eb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.117782 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"05dbe8d28bb464282550d556c5179b45bccbb00dbc50df40e302e13be9e2560f"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.119631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" event={"ID":"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b","Type":"ContainerStarted","Data":"75455501f1e4eb4655f1c414464f15cbf58af1fb8cd4a6244ba2ec58ddc711bc"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.122131 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.122325 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.62230011 +0000 UTC m=+220.559861418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.122414 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.122521 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" event={"ID":"ca03f0a2-fdee-42d5-a671-212f7b35b6aa","Type":"ContainerStarted","Data":"280186d417e4502ac2f21079874b5e4923d3228781e714b7dab47f6be14856fb"}
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.122796 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.622784473 +0000 UTC m=+220.560345771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.155778 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" event={"ID":"257bed8a-876b-4f5e-8a4c-66c1e47b33dc","Type":"ContainerStarted","Data":"1689e783f8d296b565b6ecb516cffbd3e968b5a2b9ce756bb3b38e4ece52df4e"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.161143 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" event={"ID":"50a07abb-e77f-450d-990f-3c9e3b0360d9","Type":"ContainerStarted","Data":"9ad08f9d1236806843cc4b2ae854e0ccd33df5df821868830b5fae5097849cc1"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.161181 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" event={"ID":"50a07abb-e77f-450d-990f-3c9e3b0360d9","Type":"ContainerStarted","Data":"99a59e6df79818eb94abf91219dff91fc4eb59264b81a348abdfb8a956209181"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.169323 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" event={"ID":"ebdc7a41-2398-46bd-9724-aca23394d4b3","Type":"ContainerStarted","Data":"3356190c21ae7c0182009037893220263bf7037f4036681ccc258356089bad4d"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.170629 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4vrtf" podStartSLOduration=173.17061801 podStartE2EDuration="2m53.17061801s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.157229113 +0000 UTC m=+220.094790401" watchObservedRunningTime="2026-02-27 01:08:27.17061801 +0000 UTC m=+220.108179298"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.183047 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" event={"ID":"ee145931-4993-4e80-88c9-1f8a4f46e77c","Type":"ContainerStarted","Data":"310c8b67b81ee00b00c7c155b12f0985e9c0a5ae1e48e6d5581d863dda8ab886"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.183110 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" event={"ID":"ee145931-4993-4e80-88c9-1f8a4f46e77c","Type":"ContainerStarted","Data":"b9f0df88ba10fd99fcd213140a22258f2f81a93ebaa3616c97b2e4c71b36cb14"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.196514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" event={"ID":"608f3fea-4388-4d6b-8795-fbba59621e28","Type":"ContainerStarted","Data":"1b2e7082c5a69e9938b3070c15629c0b0c1cff37c747f446230a8b6af9b7c39d"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.210365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" event={"ID":"581a9e83-a359-4b05-b9c0-0d4c8d39277b","Type":"ContainerStarted","Data":"9a9dcecf80d19528552ac2617d5498e1c4314f6a98a1860ac900907fc5ec3056"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.223680 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.223819 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.723780521 +0000 UTC m=+220.661341809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.224017 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.228814 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.728795455 +0000 UTC m=+220.666356743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.231367 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" event={"ID":"179b172a-a753-4f11-9532-63816979538a","Type":"ContainerStarted","Data":"8713e36d9d36df5b6d95754def13484a85337456f5a8da7ee35edc9cbffcdf0f"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.253480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" event={"ID":"72432eea-a601-4d93-8aee-41ff9573ff0a","Type":"ContainerStarted","Data":"bad3ad59340c3109bcd2654603e2247c890ae508e379833985d39b53e551d019"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.264709 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc136422-ec9e-411c-9c1c-5704e6033226" containerID="50372cb0074b6494b3c455d1583393c324df5157735189e25482b62a489208d5" exitCode=0
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.265264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" event={"ID":"dc136422-ec9e-411c-9c1c-5704e6033226","Type":"ContainerDied","Data":"50372cb0074b6494b3c455d1583393c324df5157735189e25482b62a489208d5"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.284833 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" event={"ID":"f4eaf94a-ef2d-48bb-8762-bad950a6918a","Type":"ContainerStarted","Data":"abd7bde8a655c2e6c14661193145ef7192db8b2427fbda2055a958a135ba2693"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.312365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1be692e4d45e2de8fd8fed902399785fbcd142c1616f89f48be41e8a1b016ff9"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.324600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.325755 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.825740795 +0000 UTC m=+220.763302083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.327700 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-crqhd" event={"ID":"fd53cc9e-5423-4ad7-afe5-54824c08341e","Type":"ContainerStarted","Data":"38377d8fa1cc70d073439136efcc054153b9bcb6c1787c445aa28edfa35f35e4"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.329063 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" podStartSLOduration=174.329024652 podStartE2EDuration="2m54.329024652s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.325028536 +0000 UTC m=+220.262589824" watchObservedRunningTime="2026-02-27 01:08:27.329024652 +0000 UTC m=+220.266585940"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.352649 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" event={"ID":"6be7b2b4-9297-4d34-8ebc-72e57afda4e4","Type":"ContainerStarted","Data":"7125ad5eac2026a68e443dc436bd6c17ba8f36b6b3848221c004f2a0028f5885"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.355473 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" event={"ID":"f93aff21-0c7f-43b8-a1da-2c35dbfd8831","Type":"ContainerStarted","Data":"d66dc3f3b3104f8b44aad0d92603a775c0f044f672056361b6dee160a93d0403"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.359029 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" event={"ID":"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f","Type":"ContainerStarted","Data":"da8f27b84c83d0e71f1ecd5fa7ebb6cab80fefd24819010daf7b3e64fbc0071b"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.362777 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" event={"ID":"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb","Type":"ContainerStarted","Data":"220b4a88aa28780b76c62dc4ca65c70b58a3d2f88cb3f486d9a49092db6907ba"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.365674 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" event={"ID":"e0d5634e-ce3f-40a5-b85d-64f8c4708c59","Type":"ContainerStarted","Data":"a8cbcd7114d387115d3a054b6ade4c8a81ee65dc780cca737549331a0b6d2646"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.367198 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c2f59db7ef1f7c4dca77787a37dc857bf9a39bd3d6017f01ebce948fad6c7734"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.376453 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qwgc" podStartSLOduration=173.376434589 podStartE2EDuration="2m53.376434589s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.375500474 +0000 UTC m=+220.313061762" watchObservedRunningTime="2026-02-27 01:08:27.376434589 +0000 UTC m=+220.313995877"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.380648 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" event={"ID":"d5d1adad-cc9f-4d57-8099-d8e3323da190","Type":"ContainerStarted","Data":"64961de63e07eb17d25086f2fef12b90ddb926a7687293ab7d36c7cdb68a17df"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.387001 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" event={"ID":"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996","Type":"ContainerStarted","Data":"e080988478fdabc414dc6ecbeb972adefb559d6202353f4c83abc0fed54caf50"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.400898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" event={"ID":"dedd2c80-3f88-4871-82b4-7744b17d00fc","Type":"ContainerStarted","Data":"6189113b75272595af8014e0fb97707e7d88c63cafed4cca74a817b7cd102596"}
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.427584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.428410 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:27.928395177 +0000 UTC m=+220.865956455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.529579 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-z6ptp" podStartSLOduration=174.52956085 podStartE2EDuration="2m54.52956085s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.525393709 +0000 UTC m=+220.462954997" watchObservedRunningTime="2026-02-27 01:08:27.52956085 +0000 UTC m=+220.467122138"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.530493 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.530781 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.030768343 +0000 UTC m=+220.968329621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.590401 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wm5j7" podStartSLOduration=174.590379695 podStartE2EDuration="2m54.590379695s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.559473839 +0000 UTC m=+220.497035137" watchObservedRunningTime="2026-02-27 01:08:27.590379695 +0000 UTC m=+220.527940983"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.623751 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tsf86" podStartSLOduration=174.623728546 podStartE2EDuration="2m54.623728546s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.620576002 +0000 UTC m=+220.558137290" watchObservedRunningTime="2026-02-27 01:08:27.623728546 +0000 UTC m=+220.561289834"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.632012 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.632355 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.132340676 +0000 UTC m=+221.069901964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.665295 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nskbr" podStartSLOduration=173.665276576 podStartE2EDuration="2m53.665276576s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.665168062 +0000 UTC m=+220.602729350" watchObservedRunningTime="2026-02-27 01:08:27.665276576 +0000 UTC m=+220.602837864"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.732891 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.733254 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.233223451 +0000 UTC m=+221.170784739 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.733427 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.733806 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.233789896 +0000 UTC m=+221.171351184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.741626 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 27 01:08:27 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Feb 27 01:08:27 crc kubenswrapper[4771]: [+]process-running ok
Feb 27 01:08:27 crc kubenswrapper[4771]: healthz check failed
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.741684 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.830055 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rmkfg" podStartSLOduration=173.830025887 podStartE2EDuration="2m53.830025887s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.764750663 +0000 UTC m=+220.702311951" watchObservedRunningTime="2026-02-27 01:08:27.830025887 +0000 UTC m=+220.767587185"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.834127 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.834512 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.334482085 +0000 UTC m=+221.272043373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.834675 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.834967 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.334958809 +0000 UTC m=+221.272520097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.841168 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wf8jh" podStartSLOduration=6.841135703 podStartE2EDuration="6.841135703s" podCreationTimestamp="2026-02-27 01:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.828210898 +0000 UTC m=+220.765772186" watchObservedRunningTime="2026-02-27 01:08:27.841135703 +0000 UTC m=+220.778696991"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.906050 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-28znj" podStartSLOduration=174.906036248 podStartE2EDuration="2m54.906036248s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.90386582 +0000 UTC m=+220.841427108" watchObservedRunningTime="2026-02-27 01:08:27.906036248 +0000 UTC m=+220.843597536"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.936463 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:27 crc kubenswrapper[4771]: E0227 01:08:27.938895 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.438876235 +0000 UTC m=+221.376437523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.954506 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" podStartSLOduration=173.954473811 podStartE2EDuration="2m53.954473811s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.946113748 +0000 UTC m=+220.883675036" watchObservedRunningTime="2026-02-27 01:08:27.954473811 +0000 UTC m=+220.892035099"
Feb 27 01:08:27 crc kubenswrapper[4771]: I0227 01:08:27.974571 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zv9vk" podStartSLOduration=174.974534587 podStartE2EDuration="2m54.974534587s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:27.97050254 +0000 UTC m=+220.908063828" watchObservedRunningTime="2026-02-27 01:08:27.974534587 +0000 UTC m=+220.912095875"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.039687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.040044 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.540033787 +0000 UTC m=+221.477595075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.141466 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.142050 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.642034742 +0000 UTC m=+221.579596030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.243657 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.243974 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.743964255 +0000 UTC m=+221.681525543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.344411 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.344765 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.844747258 +0000 UTC m=+221.782308546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.345240 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.345536 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.845526838 +0000 UTC m=+221.783088126 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.438992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" event={"ID":"c2be6688-5fef-4657-9eea-235fc8bb13f7","Type":"ContainerStarted","Data":"50dbb1f291a960ae1f5ed7012165a8a9306f909ab7a0dcee01202dbc1bd5b229"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.440844 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.442018 4771 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-khn28 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.442054 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" podUID="c2be6688-5fef-4657-9eea-235fc8bb13f7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.447213 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.447565 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:28.947537583 +0000 UTC m=+221.885098861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.462146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a90e9a6b65e76ee34b01e6f99d58f6a2c59950d274659a314a1cd49ea300ab48"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.462786 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.489697 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28" podStartSLOduration=174.489679239 podStartE2EDuration="2m54.489679239s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.470838296 +0000 UTC m=+221.408399584" watchObservedRunningTime="2026-02-27 01:08:28.489679239 +0000 UTC m=+221.427240527"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.490865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-crqhd" event={"ID":"fd53cc9e-5423-4ad7-afe5-54824c08341e","Type":"ContainerStarted","Data":"9b27a116455b09fd30b59df5283fb34eef7a6d44c0211e67843abf692135bcbe"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.491658 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-crqhd"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.496285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c6f5833205e258cfe2ebb3458547a7e2e0f483839159172d8c556b6d565450e6"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.505999 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" event={"ID":"b9b84607-b33c-4c44-8331-9e09df2cccfe","Type":"ContainerStarted","Data":"447dae4e8678a8c86fbeb3e8914cee333a814879e2c7c4a87b3625994be847f1"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.506043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" event={"ID":"b9b84607-b33c-4c44-8331-9e09df2cccfe","Type":"ContainerStarted","Data":"1216edcca32f4c16ae672897aa10f3bd1f7d1aa3f4c039713489104e935a9a87"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.507723 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" event={"ID":"d5d1adad-cc9f-4d57-8099-d8e3323da190","Type":"ContainerStarted","Data":"3587a2449c6118c4d1b016d330f3090d96dd4c8c69ea6efb52ec91002847eb18"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.518894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" event={"ID":"fa4d3b1a-a5e4-4edb-af7f-81dfad3a5e8b","Type":"ContainerStarted","Data":"fcd551aea2837433992584ebccb96b5ddeebfe64e940dbf0e3414d31a7cd9cff"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.525489 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" event={"ID":"ca03f0a2-fdee-42d5-a671-212f7b35b6aa","Type":"ContainerStarted","Data":"ede38f68844e811d4fdca14cc42213cb0b3a82178e50148299a30dd83516c24e"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.525532 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" event={"ID":"ca03f0a2-fdee-42d5-a671-212f7b35b6aa","Type":"ContainerStarted","Data":"3d5661b3a94ed0d672136ff0e0d647b54620327decf3f5263d71a98e2385c713"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.541205 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" event={"ID":"dedd2c80-3f88-4871-82b4-7744b17d00fc","Type":"ContainerStarted","Data":"2fddab11dcb5e3d1dadbd648cadd6aa98e52d59a0b80742b8c1d590c9a1ba65d"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.543457 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-98pdr" podStartSLOduration=175.543447886 podStartE2EDuration="2m55.543447886s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.542049189 +0000 UTC m=+221.479610477" watchObservedRunningTime="2026-02-27 01:08:28.543447886 +0000 UTC m=+221.481009164"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.543996 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-crqhd" podStartSLOduration=7.54399009 podStartE2EDuration="7.54399009s" podCreationTimestamp="2026-02-27 01:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.513169897 +0000 UTC m=+221.450731175" watchObservedRunningTime="2026-02-27 01:08:28.54399009 +0000 UTC m=+221.481551378"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.555687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.557476 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.057463771 +0000 UTC m=+221.995025059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.559941 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" event={"ID":"179b172a-a753-4f11-9532-63816979538a","Type":"ContainerStarted","Data":"682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.560727 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.566678 4771 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-g6lg5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.566727 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" podUID="179b172a-a753-4f11-9532-63816979538a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.591501 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ltdq5" podStartSLOduration=175.591486209 podStartE2EDuration="2m55.591486209s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.589917657 +0000 UTC m=+221.527478945" watchObservedRunningTime="2026-02-27 01:08:28.591486209 +0000 UTC m=+221.529047497"
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.592070 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6df46e892370461eebdcf6194f367d93a73c6747e7a8fced5f75d3365715c81f"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.603643 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" event={"ID":"72432eea-a601-4d93-8aee-41ff9573ff0a","Type":"ContainerStarted","Data":"0ce7fa280cda6284e936c61a3e257f6bf757dc626c9a5007e586be34c7f5a73f"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.609802 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" event={"ID":"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f","Type":"ContainerStarted","Data":"1b3574fb286a80d75b5a5882fd2d9377e4f9caec3076cfd509faed7e8a6a0a57"}
Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.609835 4771
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" event={"ID":"af9deae8-8c32-4c23-8b0a-f5f1d4b0ef3f","Type":"ContainerStarted","Data":"f1b4824d54b8d8829fe2d554ae3df1c9390be6c4f59997296c7037ec57819073"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.615294 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cqlnh" podStartSLOduration=175.615280425 podStartE2EDuration="2m55.615280425s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.614870364 +0000 UTC m=+221.552431652" watchObservedRunningTime="2026-02-27 01:08:28.615280425 +0000 UTC m=+221.552841713" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.632141 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" event={"ID":"608f3fea-4388-4d6b-8795-fbba59621e28","Type":"ContainerStarted","Data":"0164ca87b5f1731894a0a62133023ed147c0a8dc4167af4a74fc9d96b70e4b8a"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.644940 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" event={"ID":"f93aff21-0c7f-43b8-a1da-2c35dbfd8831","Type":"ContainerStarted","Data":"e296c3f8e61791a36657cda1333cbe3dd6e6988f5628fe4071ab156a03117b88"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.649297 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4eaf94a-ef2d-48bb-8762-bad950a6918a" containerID="149030b6ff2974783ce06f173e33b93a4ca0b400c947d75789278a4abcac5def" exitCode=0 Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.649353 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" event={"ID":"f4eaf94a-ef2d-48bb-8762-bad950a6918a","Type":"ContainerDied","Data":"149030b6ff2974783ce06f173e33b93a4ca0b400c947d75789278a4abcac5def"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.654748 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" event={"ID":"963fd070-b5e6-4a67-afd6-d056aacf8bc2","Type":"ContainerStarted","Data":"a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.655506 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.656140 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rmxxc" podStartSLOduration=175.656130796 podStartE2EDuration="2m55.656130796s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.655802678 +0000 UTC m=+221.593363966" watchObservedRunningTime="2026-02-27 01:08:28.656130796 +0000 UTC m=+221.593692084" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.659132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.659313 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.159300811 +0000 UTC m=+222.096862099 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.659331 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.660295 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.160287587 +0000 UTC m=+222.097848875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.667014 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rwr4f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.667056 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.670320 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" event={"ID":"a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17","Type":"ContainerStarted","Data":"5c2d996515ce63cd5b4285bff624279dbe4b5883062c044858e1340eb7fbc5fa"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.673583 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.681360 4771 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4ltbb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.681405 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" podUID="a6e4a2c7-0dd0-471a-a63c-aacd72b2cb17" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.690204 4771 generic.go:334] "Generic (PLEG): container finished" podID="2e58f1a0-a75d-4280-8cfc-c249696d0b38" containerID="a2d86fb8fe2c1f338d2c6ffcc608addaaaf583a781c138d5fb005598baf6a5f2" exitCode=0 Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.690264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" event={"ID":"2e58f1a0-a75d-4280-8cfc-c249696d0b38","Type":"ContainerDied","Data":"a2d86fb8fe2c1f338d2c6ffcc608addaaaf583a781c138d5fb005598baf6a5f2"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.716884 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-s7265" podStartSLOduration=175.716867459 podStartE2EDuration="2m55.716867459s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-27 01:08:28.673794358 +0000 UTC m=+221.611355646" watchObservedRunningTime="2026-02-27 01:08:28.716867459 +0000 UTC m=+221.654428747" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.717137 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" podStartSLOduration=174.717132526 podStartE2EDuration="2m54.717132526s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.714218928 +0000 UTC m=+221.651780216" watchObservedRunningTime="2026-02-27 01:08:28.717132526 +0000 UTC m=+221.654693814" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.719486 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" event={"ID":"b9f091ab-b345-4bf0-ac8e-b44181c8553f","Type":"ContainerStarted","Data":"bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.720033 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.733299 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" event={"ID":"dc136422-ec9e-411c-9c1c-5704e6033226","Type":"ContainerStarted","Data":"441c98c843b94bfeb5f9ead78763d6c2d75396e8f8abec1dfaf41415dc877405"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.733584 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.743936 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:28 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 27 01:08:28 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:28 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.743989 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.744273 4771 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wgwwp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.744288 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" podUID="b9f091ab-b345-4bf0-ac8e-b44181c8553f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.749448 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" event={"ID":"602fa36f-0b9d-4ce5-abc5-b8d894cbf0fb","Type":"ContainerStarted","Data":"46858d3e8eb71bcfeefcfe5d74b38ed72c271043edc61b235eaf1d21a6cd8241"} Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.749480 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.754642 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb" podStartSLOduration=174.754621538 podStartE2EDuration="2m54.754621538s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.752663935 +0000 UTC m=+221.690225223" watchObservedRunningTime="2026-02-27 01:08:28.754621538 +0000 UTC m=+221.692182816" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.760759 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.764975 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.264959084 +0000 UTC m=+222.202520372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.787622 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tdq29" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.862612 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.867765 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.367744369 +0000 UTC m=+222.305305657 (durationBeforeRetry 500ms). 
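
[Editor's note: the readiness-probe failures interleaved above (catalog-operator, route-controller-manager, marketplace-operator, olm-operator, controller-manager) are the normal race right after a container starts: the kubelet begins probing as soon as the container is running, and until the process binds its port the TCP dial is refused. These clear on their own, as the later status="ready" probe events show. A probe of this kind is essentially a timed HTTP GET whose connection errors and non-2xx/3xx statuses are reported as failures; a minimal sketch under that assumption follows, reusing the catalog-operator endpoint from the log purely as an example (the timeout value is also an assumption).]

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probeReadiness performs one HTTP readiness check, roughly in the
// spirit of the kubelet prober: bounded by a timeout, succeeding on
// 2xx/3xx, and reporting dial errors ("connection refused") as probe
// failures rather than fatal conditions.
func probeReadiness(url string) error {
	client := &http.Client{
		Timeout: 1 * time.Second,
		Transport: &http.Transport{
			// Operator endpoints use self-signed certs; the prober
			// does not verify them for a plain readiness check.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("Get %q: %v", url, err) // e.g. connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeReadiness("https://10.217.0.31:8443/healthz"); err != nil {
		fmt.Println("Probe failed:", err)
	} else {
		fmt.Println("ready")
	}
}
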
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.877783 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bdlp9" podStartSLOduration=175.877756738 podStartE2EDuration="2m55.877756738s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.86851103 +0000 UTC m=+221.806072318" watchObservedRunningTime="2026-02-27 01:08:28.877756738 +0000 UTC m=+221.815318026" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.936503 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qr2pd" podStartSLOduration=175.936398693 podStartE2EDuration="2m55.936398693s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.898630034 +0000 UTC m=+221.836191322" watchObservedRunningTime="2026-02-27 01:08:28.936398693 +0000 UTC m=+221.873959981" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.964309 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:28 crc kubenswrapper[4771]: E0227 01:08:28.964711 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.46469447 +0000 UTC m=+222.402255758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.977767 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" podStartSLOduration=174.977748839 podStartE2EDuration="2m54.977748839s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:28.935323195 +0000 UTC m=+221.872884483" watchObservedRunningTime="2026-02-27 01:08:28.977748839 +0000 UTC m=+221.915310127" Feb 27 01:08:28 crc kubenswrapper[4771]: I0227 01:08:28.995833 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-z6ptp" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.038530 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-m6vbm" podStartSLOduration=176.038514611 podStartE2EDuration="2m56.038514611s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:29.010958506 +0000 UTC m=+221.948519794" watchObservedRunningTime="2026-02-27 01:08:29.038514611 +0000 UTC m=+221.976075899" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.039047 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wgwwp"] Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.068320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.068613 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.568601226 +0000 UTC m=+222.506162514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.149222 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-crfkk" podStartSLOduration=176.149207098 podStartE2EDuration="2m56.149207098s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:29.089490544 +0000 UTC m=+222.027051832" watchObservedRunningTime="2026-02-27 01:08:29.149207098 +0000 UTC m=+222.086768386" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.169848 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.170200 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"] Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.170277 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.670262001 +0000 UTC m=+222.607823289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.174569 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" podStartSLOduration=175.174540136 podStartE2EDuration="2m55.174540136s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:29.172969063 +0000 UTC m=+222.110530351" watchObservedRunningTime="2026-02-27 01:08:29.174540136 +0000 UTC m=+222.112101414" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.271410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.271791 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.771778543 +0000 UTC m=+222.709339831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.273254 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" podStartSLOduration=176.273244102 podStartE2EDuration="2m56.273244102s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:29.270880959 +0000 UTC m=+222.208442247" watchObservedRunningTime="2026-02-27 01:08:29.273244102 +0000 UTC m=+222.210805390" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.273761 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr" podStartSLOduration=176.273755796 podStartE2EDuration="2m56.273755796s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:29.232820623 +0000 UTC m=+222.170381911" watchObservedRunningTime="2026-02-27 01:08:29.273755796 +0000 UTC m=+222.211317084" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.340809 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.359171 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.372086 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.372223 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.872203167 +0000 UTC m=+222.809764445 (durationBeforeRetry 500ms). 
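
[Editor's note: the "Observed pod startup duration" lines come from the kubelet's pod_startup_latency_tracker. The reported duration is simply the observed-running timestamp minus podCreationTimestamp, and the zero-value firstStartedPulling/lastFinishedPulling timestamps ("0001-01-01 ...") indicate that no image pull was recorded, i.e. the images were already on the node. So the roughly three-minute "startup" durations here reflect time spent waiting out the node boot, not pulling images. A toy recomputation of the catalog-operator entry above (monotonic-clock suffixes "m=+..." trimmed so the timestamps parse; in that entry the reported duration lines up exactly with watchObservedRunningTime):]

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST"

	// Values copied from the catalog-operator entry above.
	created, _ := time.Parse(layout, "2026-02-27 01:05:34 +0000 UTC")
	observed, _ := time.Parse(layout, "2026-02-27 01:08:28.489679239 +0000 UTC")

	// duration = observed running time - podCreationTimestamp
	fmt.Println(observed.Sub(created)) // 2m54.489679239s, matching podStartSLOduration=174.489679239
}
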
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.372302 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.372623 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:29.872611967 +0000 UTC m=+222.810173255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.473891 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6de81df-af0d-4ebe-b254-7a45c4eb5312-config-volume\") pod \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.473948 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grbg2\" (UniqueName: \"kubernetes.io/projected/b6de81df-af0d-4ebe-b254-7a45c4eb5312-kube-api-access-grbg2\") pod \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.473994 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6de81df-af0d-4ebe-b254-7a45c4eb5312-secret-volume\") pod \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\" (UID: \"b6de81df-af0d-4ebe-b254-7a45c4eb5312\") " Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.474082 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.474639 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 01:08:29.974619132 +0000 UTC m=+222.912180420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.475078 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6de81df-af0d-4ebe-b254-7a45c4eb5312-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6de81df-af0d-4ebe-b254-7a45c4eb5312" (UID: "b6de81df-af0d-4ebe-b254-7a45c4eb5312"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.486867 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6de81df-af0d-4ebe-b254-7a45c4eb5312-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6de81df-af0d-4ebe-b254-7a45c4eb5312" (UID: "b6de81df-af0d-4ebe-b254-7a45c4eb5312"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.500211 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6de81df-af0d-4ebe-b254-7a45c4eb5312-kube-api-access-grbg2" (OuterVolumeSpecName: "kube-api-access-grbg2") pod "b6de81df-af0d-4ebe-b254-7a45c4eb5312" (UID: "b6de81df-af0d-4ebe-b254-7a45c4eb5312"). InnerVolumeSpecName "kube-api-access-grbg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.575868 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.575930 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6de81df-af0d-4ebe-b254-7a45c4eb5312-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.575940 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grbg2\" (UniqueName: \"kubernetes.io/projected/b6de81df-af0d-4ebe-b254-7a45c4eb5312-kube-api-access-grbg2\") on node \"crc\" DevicePath \"\"" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.575951 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6de81df-af0d-4ebe-b254-7a45c4eb5312-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.576166 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.076156505 +0000 UTC m=+223.013717793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.676444 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.676761 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.176698081 +0000 UTC m=+223.114259369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.676910 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.677283 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.177274887 +0000 UTC m=+223.114836175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.733628 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:29 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 27 01:08:29 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:29 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.733677 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.760403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" event={"ID":"b6de81df-af0d-4ebe-b254-7a45c4eb5312","Type":"ContainerDied","Data":"d4f6caf413b15a52ff7c90f6068a74ecff0a42c9f3362a5a5ff79c68acf00271"} Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.760443 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f6caf413b15a52ff7c90f6068a74ecff0a42c9f3362a5a5ff79c68acf00271" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.760471 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.763725 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" event={"ID":"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996","Type":"ContainerStarted","Data":"377bb35c242985ebd769aeed90634502bb4f0382c246eee385587a77b4999240"} Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.769102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" event={"ID":"2e58f1a0-a75d-4280-8cfc-c249696d0b38","Type":"ContainerStarted","Data":"7cc1de0d4a9f6b391e4f75da6a7a41f35aef766255dd46432145720e23771ada"} Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.780125 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.780287 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.280248287 +0000 UTC m=+223.217809575 (durationBeforeRetry 500ms). 
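
[Editor's note: the contrast just above is instructive. The collect-profiles pod's configmap, secret, and projected volumes tear down immediately ("UnmountVolume.TearDown succeeded"), because those in-tree plugins operate on local files and need no external driver, while the CSI-backed image-registry PVC keeps failing until its driver re-registers. The csi-hostpathplugin-mwrgv ContainerStarted event above is the fix arriving: once that pod's node plugin registers, the driver appears in the node's CSINode object and the pending operations succeed. A small client-go sketch for checking which CSI drivers a node currently advertises; the default kubeconfig path and the node name "crc" are assumptions here:]

package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a reachable cluster via the default kubeconfig.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	// The CSINode object lists every CSI driver whose node plugin has
	// registered with the kubelet on that node; while the log above is
	// looping, kubevirt.io.hostpath-provisioner is absent from it.
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println("registered CSI driver:", d.Name)
	}
}
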
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.780429 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.781372 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.281355877 +0000 UTC m=+223.218917245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.787335 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" event={"ID":"f4eaf94a-ef2d-48bb-8762-bad950a6918a","Type":"ContainerStarted","Data":"7d942dc0f5145d771e97ede3c30ff8b16461fba271e637e1f5a2c64bc6a8d9af"} Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.798360 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rwr4f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.798417 4771 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wgwwp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.798474 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" podUID="b9f091ab-b345-4bf0-ac8e-b44181c8553f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.798422 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Feb 27 
01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.800183 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" podStartSLOduration=175.800170089 podStartE2EDuration="2m55.800170089s" podCreationTimestamp="2026-02-27 01:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:29.798922036 +0000 UTC m=+222.736483324" watchObservedRunningTime="2026-02-27 01:08:29.800170089 +0000 UTC m=+222.737731377"
Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.811996 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4ltbb"
Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.829893 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-khn28"
Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.882348 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.882448 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.382430127 +0000 UTC m=+223.319991415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.883305 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.887760 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.387744989 +0000 UTC m=+223.325306277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.892602 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"
Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.942095 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46304: no serving certificate available for the kubelet"
Feb 27 01:08:29 crc kubenswrapper[4771]: I0227 01:08:29.985769 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:29 crc kubenswrapper[4771]: E0227 01:08:29.986747 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.486728144 +0000 UTC m=+223.424289432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.014956 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46314: no serving certificate available for the kubelet"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.087276 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.087767 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.587750113 +0000 UTC m=+223.525311401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.154308 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46316: no serving certificate available for the kubelet"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.188462 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.188687 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.688664738 +0000 UTC m=+223.626226026 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.188736 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.189097 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.689080729 +0000 UTC m=+223.626642017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.255526 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46326: no serving certificate available for the kubelet"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.289937 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.290128 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.790102928 +0000 UTC m=+223.727664216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.290207 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.290518 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.790511059 +0000 UTC m=+223.728072347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.308336 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5mtwr"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.390087 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46338: no serving certificate available for the kubelet"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.391632 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.391880 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.891866077 +0000 UTC m=+223.829427365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.492884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.493188 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:30.993177633 +0000 UTC m=+223.930738921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.574537 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.574838 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.593679 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.594031 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.094012727 +0000 UTC m=+224.031574015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.594328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.594584 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.094576123 +0000 UTC m=+224.032137411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.601743 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46354: no serving certificate available for the kubelet"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.696161 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.696462 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.196441053 +0000 UTC m=+224.134002341 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.737966 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 27 01:08:30 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld
Feb 27 01:08:30 crc kubenswrapper[4771]: [+]process-running ok
Feb 27 01:08:30 crc kubenswrapper[4771]: healthz check failed
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.738029 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.797214 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.797527 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.297515744 +0000 UTC m=+224.235077032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.804513 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" event={"ID":"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996","Type":"ContainerStarted","Data":"3427abf92e6d78fa86869b9fc2819b83e1d246c5126973f77666579c415169b5"}
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.811452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" event={"ID":"f4eaf94a-ef2d-48bb-8762-bad950a6918a","Type":"ContainerStarted","Data":"c55b6fbd0529769fd0ca06085b2680e6da0a85f7ad6843a64d3229731a86d080"}
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.812008 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" podUID="179b172a-a753-4f11-9532-63816979538a" containerName="route-controller-manager" containerID="cri-o://682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5" gracePeriod=30
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.812498 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rwr4f container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.812529 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.813215 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" podUID="b9f091ab-b345-4bf0-ac8e-b44181c8553f" containerName="controller-manager" containerID="cri-o://bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c" gracePeriod=30
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.821493 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.841833 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46362: no serving certificate available for the kubelet"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.848013 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" podStartSLOduration=177.847997613 podStartE2EDuration="2m57.847997613s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:30.846123782 +0000 UTC m=+223.783685070" watchObservedRunningTime="2026-02-27 01:08:30.847997613 +0000 UTC m=+223.785558891"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.898048 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.898441 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.39842575 +0000 UTC m=+224.335987038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.976322 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lds4k"]
Feb 27 01:08:30 crc kubenswrapper[4771]: E0227 01:08:30.976519 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6de81df-af0d-4ebe-b254-7a45c4eb5312" containerName="collect-profiles"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.976535 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6de81df-af0d-4ebe-b254-7a45c4eb5312" containerName="collect-profiles"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.976636 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6de81df-af0d-4ebe-b254-7a45c4eb5312" containerName="collect-profiles"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.977240 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:30 crc kubenswrapper[4771]: I0227 01:08:30.979379 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:30.999282 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.000563 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lds4k"]
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.001480 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.501466302 +0000 UTC m=+224.439027590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.028085 4771 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.103681 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.103847 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-catalog-content\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.103870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzlpt\" (UniqueName: \"kubernetes.io/projected/0b5757cb-321d-4a76-8769-786b28a2b004-kube-api-access-kzlpt\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.103910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-utilities\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.104011 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.603993262 +0000 UTC m=+224.541554550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.168786 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xk58g"]
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.169943 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.177778 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.184236 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xk58g"]
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.198293 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46368: no serving certificate available for the kubelet"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.204648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-utilities\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.204732 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-catalog-content\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.204752 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzlpt\" (UniqueName: \"kubernetes.io/projected/0b5757cb-321d-4a76-8769-786b28a2b004-kube-api-access-kzlpt\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.204775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.205024 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.70501311 +0000 UTC m=+224.642574398 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.205045 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-utilities\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.205398 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-catalog-content\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.229143 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzlpt\" (UniqueName: \"kubernetes.io/projected/0b5757cb-321d-4a76-8769-786b28a2b004-kube-api-access-kzlpt\") pod \"certified-operators-lds4k\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.235364 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.289000 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"]
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.289411 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179b172a-a753-4f11-9532-63816979538a" containerName="route-controller-manager"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.289430 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="179b172a-a753-4f11-9532-63816979538a" containerName="route-controller-manager"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.289607 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="179b172a-a753-4f11-9532-63816979538a" containerName="route-controller-manager"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.290103 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.296032 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"]
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.306199 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.806169463 +0000 UTC m=+224.743730741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.305630 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.310102 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-catalog-content\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.310211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.310332 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-utilities\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.310398 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstcr\" (UniqueName: \"kubernetes.io/projected/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-kube-api-access-sstcr\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.310929 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.810916599 +0000 UTC m=+224.748477877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.316774 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.320125 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lds4k"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.370031 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z5hx5"]
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.370215 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f091ab-b345-4bf0-ac8e-b44181c8553f" containerName="controller-manager"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.370225 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f091ab-b345-4bf0-ac8e-b44181c8553f" containerName="controller-manager"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.370302 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9f091ab-b345-4bf0-ac8e-b44181c8553f" containerName="controller-manager"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.370975 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.380226 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z5hx5"]
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415041 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzzvb\" (UniqueName: \"kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb\") pod \"179b172a-a753-4f11-9532-63816979538a\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415164 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415201 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert\") pod \"179b172a-a753-4f11-9532-63816979538a\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415225 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config\") pod \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415244 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles\") pod \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415265 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca\") pod \"179b172a-a753-4f11-9532-63816979538a\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415298 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hph8x\" (UniqueName: \"kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x\") pod \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config\") pod \"179b172a-a753-4f11-9532-63816979538a\" (UID: \"179b172a-a753-4f11-9532-63816979538a\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert\") pod \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415388 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca\") pod \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\" (UID: \"b9f091ab-b345-4bf0-ac8e-b44181c8553f\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415523 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-config\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415604 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-catalog-content\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856463d3-39ed-4435-8cfb-e6d3b7e4312d-serving-cert\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-utilities\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415679 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstcr\" (UniqueName: \"kubernetes.io/projected/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-kube-api-access-sstcr\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415708 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kvr9\" (UniqueName: \"kubernetes.io/projected/856463d3-39ed-4435-8cfb-e6d3b7e4312d-kube-api-access-8kvr9\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.415730 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-client-ca\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.417013 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b9f091ab-b345-4bf0-ac8e-b44181c8553f" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.417026 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b9f091ab-b345-4bf0-ac8e-b44181c8553f" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.417105 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:31.917088196 +0000 UTC m=+224.854649484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.417436 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-catalog-content\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.417848 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config" (OuterVolumeSpecName: "config") pod "b9f091ab-b345-4bf0-ac8e-b44181c8553f" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.418940 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-utilities\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.419078 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config" (OuterVolumeSpecName: "config") pod "179b172a-a753-4f11-9532-63816979538a" (UID: "179b172a-a753-4f11-9532-63816979538a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.419476 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "179b172a-a753-4f11-9532-63816979538a" (UID: "179b172a-a753-4f11-9532-63816979538a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.420250 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca" (OuterVolumeSpecName: "client-ca") pod "179b172a-a753-4f11-9532-63816979538a" (UID: "179b172a-a753-4f11-9532-63816979538a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.421214 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb" (OuterVolumeSpecName: "kube-api-access-xzzvb") pod "179b172a-a753-4f11-9532-63816979538a" (UID: "179b172a-a753-4f11-9532-63816979538a"). InnerVolumeSpecName "kube-api-access-xzzvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.422701 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b9f091ab-b345-4bf0-ac8e-b44181c8553f" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.427284 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x" (OuterVolumeSpecName: "kube-api-access-hph8x") pod "b9f091ab-b345-4bf0-ac8e-b44181c8553f" (UID: "b9f091ab-b345-4bf0-ac8e-b44181c8553f"). InnerVolumeSpecName "kube-api-access-hph8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.439218 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstcr\" (UniqueName: \"kubernetes.io/projected/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-kube-api-access-sstcr\") pod \"community-operators-xk58g\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.522291 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-client-ca\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.522356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-config\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.522381 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.522411 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856463d3-39ed-4435-8cfb-e6d3b7e4312d-serving-cert\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.522434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8tx\" (UniqueName: \"kubernetes.io/projected/739ccbd2-56c7-4a26-ad40-4f0f908089e8-kube-api-access-4b8tx\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.522457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-catalog-content\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.522475 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-utilities\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.522493 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kvr9\" (UniqueName: \"kubernetes.io/projected/856463d3-39ed-4435-8cfb-e6d3b7e4312d-kube-api-access-8kvr9\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.523212 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-client-ca\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.523269 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzzvb\" (UniqueName: \"kubernetes.io/projected/179b172a-a753-4f11-9532-63816979538a-kube-api-access-xzzvb\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.523276 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xk58g"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.523985 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179b172a-a753-4f11-9532-63816979538a-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.524021 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.524030 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.524039 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.524049 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hph8x\" (UniqueName: \"kubernetes.io/projected/b9f091ab-b345-4bf0-ac8e-b44181c8553f-kube-api-access-hph8x\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.524058 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/179b172a-a753-4f11-9532-63816979538a-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.524068 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f091ab-b345-4bf0-ac8e-b44181c8553f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.524076 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9f091ab-b345-4bf0-ac8e-b44181c8553f-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.526289 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-config\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.526657 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:32.026631412 +0000 UTC m=+224.964192700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.530275 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856463d3-39ed-4435-8cfb-e6d3b7e4312d-serving-cert\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.531929 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.544470 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kvr9\" (UniqueName: \"kubernetes.io/projected/856463d3-39ed-4435-8cfb-e6d3b7e4312d-kube-api-access-8kvr9\") pod \"route-controller-manager-c4f4946d6-8r6tq\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.573092 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g5gxk"]
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.574054 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g5gxk"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.596840 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g5gxk"]
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.622957 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.625073 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.625217 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:32.125193206 +0000 UTC m=+225.062754494 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.625455 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-utilities\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.625575 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.625621 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8tx\" (UniqueName: \"kubernetes.io/projected/739ccbd2-56c7-4a26-ad40-4f0f908089e8-kube-api-access-4b8tx\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.625647 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-catalog-content\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.626114 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:32.126105649 +0000 UTC m=+225.063666937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.626272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-catalog-content\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.626573 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-utilities\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.644538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8tx\" (UniqueName: \"kubernetes.io/projected/739ccbd2-56c7-4a26-ad40-4f0f908089e8-kube-api-access-4b8tx\") pod \"certified-operators-z5hx5\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.691831 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z5hx5"
Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.726712 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.726810 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:32.226794279 +0000 UTC m=+225.164355567 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.727023 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67wv4\" (UniqueName: \"kubernetes.io/projected/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-kube-api-access-67wv4\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.727048 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-utilities\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.727114 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-catalog-content\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.727141 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.727449 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:32.227441607 +0000 UTC m=+225.165002895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.733500 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:31 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 27 01:08:31 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:31 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.733532 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.744260 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lds4k"] Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.811507 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xk58g"] Feb 27 01:08:31 crc kubenswrapper[4771]: W0227 01:08:31.826271 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod460ffbff_d0f0_43dc_bde9_6279c9a4b6a3.slice/crio-9da539d313ec5162f40beb8d4a77cf80b2b2cae6a68c46d3037d807108e4be4e WatchSource:0}: Error finding container 9da539d313ec5162f40beb8d4a77cf80b2b2cae6a68c46d3037d807108e4be4e: Status 404 returned error can't find the container with id 9da539d313ec5162f40beb8d4a77cf80b2b2cae6a68c46d3037d807108e4be4e Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.828896 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.829146 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67wv4\" (UniqueName: \"kubernetes.io/projected/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-kube-api-access-67wv4\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.829175 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-utilities\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.829201 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-catalog-content\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.829580 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 01:08:32.329541745 +0000 UTC m=+225.267103033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.829643 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-catalog-content\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.829966 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-utilities\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.832815 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" event={"ID":"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996","Type":"ContainerStarted","Data":"8380aac2dde7c779cc40df312145ccd622fce3983747056d49de87ce5c3a1d90"} Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.832851 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" event={"ID":"e14c3fd8-5a78-42cc-b829-7a5b4ce8d996","Type":"ContainerStarted","Data":"95cd4bd8f0b23c6b66374540ba28713b5b1d7390981504ac3f2e91ea27b10297"} Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.843071 4771 generic.go:334] "Generic (PLEG): container finished" podID="179b172a-a753-4f11-9532-63816979538a" containerID="682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5" exitCode=0 Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.843156 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" event={"ID":"179b172a-a753-4f11-9532-63816979538a","Type":"ContainerDied","Data":"682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5"} Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.843204 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" event={"ID":"179b172a-a753-4f11-9532-63816979538a","Type":"ContainerDied","Data":"8713e36d9d36df5b6d95754def13484a85337456f5a8da7ee35edc9cbffcdf0f"} Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.843223 4771 scope.go:117] "RemoveContainer" 
containerID="682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.843383 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.850467 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67wv4\" (UniqueName: \"kubernetes.io/projected/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-kube-api-access-67wv4\") pod \"community-operators-g5gxk\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.860991 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mwrgv" podStartSLOduration=10.860970584 podStartE2EDuration="10.860970584s" podCreationTimestamp="2026-02-27 01:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:31.85931697 +0000 UTC m=+224.796878258" watchObservedRunningTime="2026-02-27 01:08:31.860970584 +0000 UTC m=+224.798531872" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.862964 4771 generic.go:334] "Generic (PLEG): container finished" podID="b9f091ab-b345-4bf0-ac8e-b44181c8553f" containerID="bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c" exitCode=0 Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.863176 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.863638 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" event={"ID":"b9f091ab-b345-4bf0-ac8e-b44181c8553f","Type":"ContainerDied","Data":"bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c"} Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.863675 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wgwwp" event={"ID":"b9f091ab-b345-4bf0-ac8e-b44181c8553f","Type":"ContainerDied","Data":"2367b6bb95a8cf9644de85afe36c77e92d4a00ad69bf30cf9a79dbb18f464e56"} Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.866172 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lds4k" event={"ID":"0b5757cb-321d-4a76-8769-786b28a2b004","Type":"ContainerStarted","Data":"e26475c1fb191aa6a0b6893b69ce3258ae015ed411a2f2645477e14bf0c9a2d8"} Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.873697 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sf4rl" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.885517 4771 ???:1] "http: TLS handshake error from 192.168.126.11:46380: no serving certificate available for the kubelet" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.893046 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.899596 4771 scope.go:117] "RemoveContainer" containerID="682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5" Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.901790 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5\": container with ID starting with 682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5 not found: ID does not exist" containerID="682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.901824 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5"} err="failed to get container status \"682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5\": rpc error: code = NotFound desc = could not find container \"682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5\": container with ID starting with 682dae31a77e9bcefb96836af2c947ce7926e134536a1f5d0aebd075f4fe84c5 not found: ID does not exist" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.901847 4771 scope.go:117] "RemoveContainer" containerID="bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c" Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.924888 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wgwwp"] Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.928279 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wgwwp"] Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.930176 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:31 crc kubenswrapper[4771]: E0227 01:08:31.930482 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 01:08:32.430470801 +0000 UTC m=+225.368032079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zmxc8" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 01:08:31 crc kubenswrapper[4771]: I0227 01:08:31.985660 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:31.999967 4771 scope.go:117] "RemoveContainer" containerID="bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.003670 4771 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-27T01:08:31.028128204Z","Handler":null,"Name":""} Feb 27 01:08:32 crc kubenswrapper[4771]: E0227 01:08:32.004533 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c\": container with ID starting with bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c not found: ID does not exist" containerID="bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.004590 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c"} err="failed to get container status \"bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c\": rpc error: code = NotFound desc = could not find container \"bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c\": container with ID starting with bd15a553da130f9baf7f148967603be9a45a395291d70f379f91fe53f2d9248c not found: ID does not exist" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.010215 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g6lg5"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.016847 4771 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.016882 4771 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.034103 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.044026 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: 
"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.075102 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.076058 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.083601 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.084518 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.089416 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.118767 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z5hx5"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.135504 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.145909 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.145962 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.184523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zmxc8\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.188595 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.189263 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.190925 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.190976 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.191170 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.208410 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.236396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.236476 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.262624 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.337373 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04f626e0-de75-453a-a7b2-e9449485c031-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04f626e0-de75-453a-a7b2-e9449485c031\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.337424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04f626e0-de75-453a-a7b2-e9449485c031-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04f626e0-de75-453a-a7b2-e9449485c031\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.337454 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.337522 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.337940 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.343938 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g5gxk"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.362143 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.439017 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmxc8"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.439105 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04f626e0-de75-453a-a7b2-e9449485c031-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04f626e0-de75-453a-a7b2-e9449485c031\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.439152 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04f626e0-de75-453a-a7b2-e9449485c031-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"04f626e0-de75-453a-a7b2-e9449485c031\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.439894 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04f626e0-de75-453a-a7b2-e9449485c031-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04f626e0-de75-453a-a7b2-e9449485c031\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.460688 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04f626e0-de75-453a-a7b2-e9449485c031-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"04f626e0-de75-453a-a7b2-e9449485c031\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.511347 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.518524 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.732020 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:32 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 27 01:08:32 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:32 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.732069 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.744597 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 01:08:32 crc kubenswrapper[4771]: W0227 01:08:32.762532 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbdc0c710_02ce_4be7_a8b9_a4b559e7ffce.slice/crio-6f9446b33db8c409ab576160544fa45fadb1f2acf0b2d63760f2d4f6cd61b55f WatchSource:0}: Error finding container 6f9446b33db8c409ab576160544fa45fadb1f2acf0b2d63760f2d4f6cd61b55f: Status 404 returned error can't find the container with id 6f9446b33db8c409ab576160544fa45fadb1f2acf0b2d63760f2d4f6cd61b55f Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.804503 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.890919 4771 generic.go:334] "Generic (PLEG): container finished" podID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerID="fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e" exitCode=0 Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.890981 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk58g" event={"ID":"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3","Type":"ContainerDied","Data":"fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.891005 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk58g" event={"ID":"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3","Type":"ContainerStarted","Data":"9da539d313ec5162f40beb8d4a77cf80b2b2cae6a68c46d3037d807108e4be4e"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.896005 4771 generic.go:334] "Generic (PLEG): container finished" podID="0b5757cb-321d-4a76-8769-786b28a2b004" containerID="a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b" exitCode=0 Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.896173 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lds4k" event={"ID":"0b5757cb-321d-4a76-8769-786b28a2b004","Type":"ContainerDied","Data":"a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.904122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"04f626e0-de75-453a-a7b2-e9449485c031","Type":"ContainerStarted","Data":"44486e245191892f59a009017f3c84f776d398defb8af7bdda61df4059de798d"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.906578 4771 generic.go:334] "Generic (PLEG): container finished" podID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerID="7e3ed9db1076d237699426a40479cb3bc8a486dad98925003573a0e0fddf9312" exitCode=0 Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.906650 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5hx5" event={"ID":"739ccbd2-56c7-4a26-ad40-4f0f908089e8","Type":"ContainerDied","Data":"7e3ed9db1076d237699426a40479cb3bc8a486dad98925003573a0e0fddf9312"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.906683 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5hx5" event={"ID":"739ccbd2-56c7-4a26-ad40-4f0f908089e8","Type":"ContainerStarted","Data":"da90fe4302f6fc3636c974a254edf9cb034bb28bde68d1d4380a2d590be2b546"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.917517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" event={"ID":"ebbe1c67-5385-4eda-af88-793c2c85e043","Type":"ContainerStarted","Data":"6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.917585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" event={"ID":"ebbe1c67-5385-4eda-af88-793c2c85e043","Type":"ContainerStarted","Data":"9fbbf4ad51c25094c0cfedf118c56dea5edbfae24cdc2aa14d2c61ab253f3321"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.917692 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.992169 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-gd7gl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.992220 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gd7gl" podUID="58839f3c-374c-43d0-ac2e-32c497ead461" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.992393 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-gd7gl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.992461 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gd7gl" podUID="58839f3c-374c-43d0-ac2e-32c497ead461" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.993142 4771 generic.go:334] "Generic (PLEG): container finished" podID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerID="95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e" 
exitCode=0 Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.993211 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" podStartSLOduration=179.993192752 podStartE2EDuration="2m59.993192752s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:32.990842449 +0000 UTC m=+225.928403737" watchObservedRunningTime="2026-02-27 01:08:32.993192752 +0000 UTC m=+225.930754050" Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.993782 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5gxk" event={"ID":"b4029ae4-2dfb-4351-88f8-08fefb8ab46e","Type":"ContainerDied","Data":"95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e"} Feb 27 01:08:32 crc kubenswrapper[4771]: I0227 01:08:32.993821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5gxk" event={"ID":"b4029ae4-2dfb-4351-88f8-08fefb8ab46e","Type":"ContainerStarted","Data":"4737c8964bff6f541285d04beba79fb67ec0e8b294c7f0fd17c2e5758f58edf6"} Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.036296 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce","Type":"ContainerStarted","Data":"6f9446b33db8c409ab576160544fa45fadb1f2acf0b2d63760f2d4f6cd61b55f"} Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.038858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" event={"ID":"856463d3-39ed-4435-8cfb-e6d3b7e4312d","Type":"ContainerStarted","Data":"00d8efd3ac413e6643bf0ca522b97b03593ca4fcd692824fa0b3c902ed914cf2"} Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.038905 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" event={"ID":"856463d3-39ed-4435-8cfb-e6d3b7e4312d","Type":"ContainerStarted","Data":"400ba0f2b8fd2fd07c286354eea9e3c710958d499754660d31db73148263b439"} Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.039639 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.060602 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.168933 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" podStartSLOduration=4.168917526 podStartE2EDuration="4.168917526s" podCreationTimestamp="2026-02-27 01:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:33.106625492 +0000 UTC m=+226.044186790" watchObservedRunningTime="2026-02-27 01:08:33.168917526 +0000 UTC m=+226.106478814" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.170859 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d5l2f"] Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.171827 
4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.176563 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.181076 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5l2f"] Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.206822 4771 ???:1] "http: TLS handshake error from 192.168.126.11:48786: no serving certificate available for the kubelet" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.254004 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-catalog-content\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.254160 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vfs\" (UniqueName: \"kubernetes.io/projected/deb9a4a5-1474-4744-a57e-fcdcc97922ed-kube-api-access-p2vfs\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.254243 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-utilities\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.355352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-utilities\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.355435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-catalog-content\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.355478 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2vfs\" (UniqueName: \"kubernetes.io/projected/deb9a4a5-1474-4744-a57e-fcdcc97922ed-kube-api-access-p2vfs\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.356039 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-utilities\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.356308 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-catalog-content\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.373095 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2vfs\" (UniqueName: \"kubernetes.io/projected/deb9a4a5-1474-4744-a57e-fcdcc97922ed-kube-api-access-p2vfs\") pod \"redhat-marketplace-d5l2f\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.489342 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.568261 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.568323 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.568704 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fd5687597-psjcb"] Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.571853 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbc2d"] Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.573114 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.574864 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fd5687597-psjcb"] Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.575460 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.579056 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbc2d"] Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.579184 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.579448 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.580008 4771 patch_prober.go:28] interesting pod/console-f9d7485db-65dsm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.580047 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-65dsm" podUID="db8009a0-8b08-421c-8f35-e3127b0b5e8e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.580885 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.581064 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.581151 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.581882 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.586569 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.659333 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-proxy-ca-bundles\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.659704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-config\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.659769 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8778c53-0dad-4321-81d2-335e0715546b-serving-cert\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 
01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.660019 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-catalog-content\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.660158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxnk7\" (UniqueName: \"kubernetes.io/projected/96a309b5-c10d-49d7-ade8-3c087250dd91-kube-api-access-vxnk7\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.660271 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-client-ca\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.660335 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-utilities\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.660358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwxf6\" (UniqueName: \"kubernetes.io/projected/b8778c53-0dad-4321-81d2-335e0715546b-kube-api-access-gwxf6\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.722224 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5l2f"] Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.731555 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:33 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 27 01:08:33 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:33 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.731597 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.761816 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8778c53-0dad-4321-81d2-335e0715546b-serving-cert\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc 
kubenswrapper[4771]: I0227 01:08:33.761865 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-catalog-content\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.761908 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxnk7\" (UniqueName: \"kubernetes.io/projected/96a309b5-c10d-49d7-ade8-3c087250dd91-kube-api-access-vxnk7\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.761953 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-client-ca\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.761973 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-utilities\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.761989 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwxf6\" (UniqueName: \"kubernetes.io/projected/b8778c53-0dad-4321-81d2-335e0715546b-kube-api-access-gwxf6\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.762016 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-proxy-ca-bundles\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.762040 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-config\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.762856 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-client-ca\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.764278 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-config\") pod \"controller-manager-fd5687597-psjcb\" (UID: 
\"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.764611 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-utilities\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.765367 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-catalog-content\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.766328 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-proxy-ca-bundles\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.770704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8778c53-0dad-4321-81d2-335e0715546b-serving-cert\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.783681 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwxf6\" (UniqueName: \"kubernetes.io/projected/b8778c53-0dad-4321-81d2-335e0715546b-kube-api-access-gwxf6\") pod \"controller-manager-fd5687597-psjcb\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.784621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxnk7\" (UniqueName: \"kubernetes.io/projected/96a309b5-c10d-49d7-ade8-3c087250dd91-kube-api-access-vxnk7\") pod \"redhat-marketplace-sbc2d\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.790925 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="179b172a-a753-4f11-9532-63816979538a" path="/var/lib/kubelet/pods/179b172a-a753-4f11-9532-63816979538a/volumes" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.791755 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.792292 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9f091ab-b345-4bf0-ac8e-b44181c8553f" path="/var/lib/kubelet/pods/b9f091ab-b345-4bf0-ac8e-b44181c8553f/volumes" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.906849 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:08:33 crc kubenswrapper[4771]: I0227 01:08:33.927056 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.051639 4771 generic.go:334] "Generic (PLEG): container finished" podID="04f626e0-de75-453a-a7b2-e9449485c031" containerID="6387875a0c0f53869813cf91684e46a9d861cdbdb70ef90126a4482290864ded" exitCode=0 Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.051707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"04f626e0-de75-453a-a7b2-e9449485c031","Type":"ContainerDied","Data":"6387875a0c0f53869813cf91684e46a9d861cdbdb70ef90126a4482290864ded"} Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.055037 4771 generic.go:334] "Generic (PLEG): container finished" podID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerID="682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71" exitCode=0 Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.055096 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5l2f" event={"ID":"deb9a4a5-1474-4744-a57e-fcdcc97922ed","Type":"ContainerDied","Data":"682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71"} Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.055160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5l2f" event={"ID":"deb9a4a5-1474-4744-a57e-fcdcc97922ed","Type":"ContainerStarted","Data":"970024a3e6803c0ecfbcd86bc8def90fe4b988ea58c19e2017830b90d54e9e46"} Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.056668 4771 generic.go:334] "Generic (PLEG): container finished" podID="bdc0c710-02ce-4be7-a8b9-a4b559e7ffce" containerID="739891135a8c5db20149cde3d92a1ad722dac5e7e00033dd1494d8658efd39c3" exitCode=0 Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.056698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce","Type":"ContainerDied","Data":"739891135a8c5db20149cde3d92a1ad722dac5e7e00033dd1494d8658efd39c3"} Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.167061 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7r96b"] Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.167998 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.173882 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.180458 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7r96b"] Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.273293 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fd5687597-psjcb"] Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.285572 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6c9\" (UniqueName: \"kubernetes.io/projected/33d95d7b-dfe7-495a-b686-5737dd95b974-kube-api-access-wg6c9\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.285860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-catalog-content\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.285919 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-utilities\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.368700 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qs5th"] Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.369912 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.376931 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qs5th"] Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.389931 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-utilities\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.390016 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6c9\" (UniqueName: \"kubernetes.io/projected/33d95d7b-dfe7-495a-b686-5737dd95b974-kube-api-access-wg6c9\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.390047 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-catalog-content\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.390372 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-utilities\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.390437 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-catalog-content\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.410442 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6c9\" (UniqueName: \"kubernetes.io/projected/33d95d7b-dfe7-495a-b686-5737dd95b974-kube-api-access-wg6c9\") pod \"redhat-operators-7r96b\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.428340 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbc2d"] Feb 27 01:08:34 crc kubenswrapper[4771]: W0227 01:08:34.435320 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a309b5_c10d_49d7_ade8_3c087250dd91.slice/crio-e45be0b3465e90c9cd58b1d66c3a3466721bda1643a8fce19892514006807477 WatchSource:0}: Error finding container e45be0b3465e90c9cd58b1d66c3a3466721bda1643a8fce19892514006807477: Status 404 returned error can't find the container with id e45be0b3465e90c9cd58b1d66c3a3466721bda1643a8fce19892514006807477 Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.446117 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.489358 4771 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.490806 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2x9v\" (UniqueName: \"kubernetes.io/projected/65fbc634-d941-4dee-a758-3b0b10bd60f0-kube-api-access-f2x9v\") pod \"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.490838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-utilities\") pod \"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.490889 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-catalog-content\") pod \"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.592113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-utilities\") pod \"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.592214 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-catalog-content\") pod \"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.592267 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2x9v\" (UniqueName: \"kubernetes.io/projected/65fbc634-d941-4dee-a758-3b0b10bd60f0-kube-api-access-f2x9v\") pod \"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.592666 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-catalog-content\") pod \"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.593243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-utilities\") pod \"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.616782 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2x9v\" (UniqueName: \"kubernetes.io/projected/65fbc634-d941-4dee-a758-3b0b10bd60f0-kube-api-access-f2x9v\") pod 
\"redhat-operators-qs5th\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.685162 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.730715 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.733491 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:34 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 27 01:08:34 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:34 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.733537 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.785235 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7r96b"] Feb 27 01:08:34 crc kubenswrapper[4771]: I0227 01:08:34.982458 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qs5th"] Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.066190 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7r96b" event={"ID":"33d95d7b-dfe7-495a-b686-5737dd95b974","Type":"ContainerStarted","Data":"11f43a18e055af50c64122c250383fecc60d3c679e110e9fec3a67d4e83cf787"} Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.067898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qs5th" event={"ID":"65fbc634-d941-4dee-a758-3b0b10bd60f0","Type":"ContainerStarted","Data":"52474d2ba3c64320970a53825bb51ad389f4effbb229761f4021772b3454f8d8"} Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.069414 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" event={"ID":"b8778c53-0dad-4321-81d2-335e0715546b","Type":"ContainerStarted","Data":"a0446a20cb480a746c947e2ebc6ceb1e793df23e87ff078cd62dec08f33ae858"} Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.069433 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" event={"ID":"b8778c53-0dad-4321-81d2-335e0715546b","Type":"ContainerStarted","Data":"c8bd06399a6d32f57bdaf65766368b38547c7480a80c25e34f70cf0f32be3429"} Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.080565 4771 generic.go:334] "Generic (PLEG): container finished" podID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerID="f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a" exitCode=0 Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.080786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbc2d" 
event={"ID":"96a309b5-c10d-49d7-ade8-3c087250dd91","Type":"ContainerDied","Data":"f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a"} Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.080814 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbc2d" event={"ID":"96a309b5-c10d-49d7-ade8-3c087250dd91","Type":"ContainerStarted","Data":"e45be0b3465e90c9cd58b1d66c3a3466721bda1643a8fce19892514006807477"} Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.453492 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.479121 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.502316 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.503289 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.511370 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.618885 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kube-api-access\") pod \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\" (UID: \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\") " Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.618955 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kubelet-dir\") pod \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\" (UID: \"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce\") " Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.619039 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04f626e0-de75-453a-a7b2-e9449485c031-kubelet-dir\") pod \"04f626e0-de75-453a-a7b2-e9449485c031\" (UID: \"04f626e0-de75-453a-a7b2-e9449485c031\") " Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.619062 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04f626e0-de75-453a-a7b2-e9449485c031-kube-api-access\") pod \"04f626e0-de75-453a-a7b2-e9449485c031\" (UID: \"04f626e0-de75-453a-a7b2-e9449485c031\") " Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.620968 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bdc0c710-02ce-4be7-a8b9-a4b559e7ffce" (UID: "bdc0c710-02ce-4be7-a8b9-a4b559e7ffce"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.621016 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04f626e0-de75-453a-a7b2-e9449485c031-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "04f626e0-de75-453a-a7b2-e9449485c031" (UID: "04f626e0-de75-453a-a7b2-e9449485c031"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.625265 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f626e0-de75-453a-a7b2-e9449485c031-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "04f626e0-de75-453a-a7b2-e9449485c031" (UID: "04f626e0-de75-453a-a7b2-e9449485c031"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.625526 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bdc0c710-02ce-4be7-a8b9-a4b559e7ffce" (UID: "bdc0c710-02ce-4be7-a8b9-a4b559e7ffce"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.720139 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.720391 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdc0c710-02ce-4be7-a8b9-a4b559e7ffce-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.720400 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04f626e0-de75-453a-a7b2-e9449485c031-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.720407 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04f626e0-de75-453a-a7b2-e9449485c031-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.736058 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:35 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 27 01:08:35 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:35 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.736105 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:35 crc kubenswrapper[4771]: I0227 01:08:35.807488 4771 ???:1] "http: TLS handshake error from 192.168.126.11:48796: no serving certificate available for the kubelet" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.107926 4771 generic.go:334] "Generic (PLEG): 
container finished" podID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerID="95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5" exitCode=0 Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.108001 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7r96b" event={"ID":"33d95d7b-dfe7-495a-b686-5737dd95b974","Type":"ContainerDied","Data":"95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5"} Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.122076 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.122270 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bdc0c710-02ce-4be7-a8b9-a4b559e7ffce","Type":"ContainerDied","Data":"6f9446b33db8c409ab576160544fa45fadb1f2acf0b2d63760f2d4f6cd61b55f"} Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.122324 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f9446b33db8c409ab576160544fa45fadb1f2acf0b2d63760f2d4f6cd61b55f" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.143701 4771 generic.go:334] "Generic (PLEG): container finished" podID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerID="559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1" exitCode=0 Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.143776 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qs5th" event={"ID":"65fbc634-d941-4dee-a758-3b0b10bd60f0","Type":"ContainerDied","Data":"559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1"} Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.151800 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.152109 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"04f626e0-de75-453a-a7b2-e9449485c031","Type":"ContainerDied","Data":"44486e245191892f59a009017f3c84f776d398defb8af7bdda61df4059de798d"} Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.152125 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44486e245191892f59a009017f3c84f776d398defb8af7bdda61df4059de798d" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.152140 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.159048 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.164394 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-76f5q" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.216525 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" podStartSLOduration=7.216509664 podStartE2EDuration="7.216509664s" podCreationTimestamp="2026-02-27 01:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:08:36.215778204 +0000 UTC m=+229.153339492" watchObservedRunningTime="2026-02-27 01:08:36.216509664 +0000 UTC m=+229.154070952" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.455243 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-crqhd" Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.731099 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:36 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Feb 27 01:08:36 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:36 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:36 crc kubenswrapper[4771]: I0227 01:08:36.731148 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:37 crc kubenswrapper[4771]: I0227 01:08:37.733372 4771 patch_prober.go:28] interesting pod/router-default-5444994796-h4dqr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 01:08:37 crc kubenswrapper[4771]: [+]has-synced ok Feb 27 01:08:37 crc kubenswrapper[4771]: [+]process-running ok Feb 27 01:08:37 crc kubenswrapper[4771]: healthz check failed Feb 27 01:08:37 crc kubenswrapper[4771]: I0227 01:08:37.733445 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h4dqr" podUID="740f2438-5f9c-40bb-ae51-77aac4708ab9" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 01:08:38 crc kubenswrapper[4771]: I0227 01:08:38.731724 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:38 crc kubenswrapper[4771]: I0227 01:08:38.733729 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-h4dqr" Feb 27 01:08:39 crc kubenswrapper[4771]: I0227 01:08:39.072341 4771 ???:1] "http: TLS handshake error from 192.168.126.11:48808: no serving certificate available for the kubelet" Feb 27 01:08:40 crc kubenswrapper[4771]: I0227 01:08:40.948216 4771 ???:1] "http: TLS handshake error from 192.168.126.11:48814: no serving certificate available for the kubelet" Feb 27 01:08:42 crc kubenswrapper[4771]: I0227 01:08:42.979472 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gd7gl" Feb 27 01:08:43 crc kubenswrapper[4771]: I0227 01:08:43.572228 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:43 crc kubenswrapper[4771]: I0227 01:08:43.576535 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:08:48 crc kubenswrapper[4771]: E0227 01:08:48.398749 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 01:08:48 crc kubenswrapper[4771]: E0227 01:08:48.399584 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 01:08:48 crc kubenswrapper[4771]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 01:08:48 crc kubenswrapper[4771]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ttp4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29535908-hhvn5_openshift-infra(e0d5634e-ce3f-40a5-b85d-64f8c4708c59): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 27 01:08:48 crc kubenswrapper[4771]: > logger="UnhandledError" Feb 27 01:08:48 crc kubenswrapper[4771]: E0227 01:08:48.400746 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" podUID="e0d5634e-ce3f-40a5-b85d-64f8c4708c59" Feb 27 01:08:48 crc 
kubenswrapper[4771]: I0227 01:08:48.572521 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fd5687597-psjcb"] Feb 27 01:08:48 crc kubenswrapper[4771]: I0227 01:08:48.572779 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" podUID="b8778c53-0dad-4321-81d2-335e0715546b" containerName="controller-manager" containerID="cri-o://a0446a20cb480a746c947e2ebc6ceb1e793df23e87ff078cd62dec08f33ae858" gracePeriod=30 Feb 27 01:08:48 crc kubenswrapper[4771]: I0227 01:08:48.575642 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"] Feb 27 01:08:48 crc kubenswrapper[4771]: I0227 01:08:48.575842 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" podUID="856463d3-39ed-4435-8cfb-e6d3b7e4312d" containerName="route-controller-manager" containerID="cri-o://00d8efd3ac413e6643bf0ca522b97b03593ca4fcd692824fa0b3c902ed914cf2" gracePeriod=30 Feb 27 01:08:49 crc kubenswrapper[4771]: I0227 01:08:49.225131 4771 generic.go:334] "Generic (PLEG): container finished" podID="b8778c53-0dad-4321-81d2-335e0715546b" containerID="a0446a20cb480a746c947e2ebc6ceb1e793df23e87ff078cd62dec08f33ae858" exitCode=0 Feb 27 01:08:49 crc kubenswrapper[4771]: I0227 01:08:49.225220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" event={"ID":"b8778c53-0dad-4321-81d2-335e0715546b","Type":"ContainerDied","Data":"a0446a20cb480a746c947e2ebc6ceb1e793df23e87ff078cd62dec08f33ae858"} Feb 27 01:08:49 crc kubenswrapper[4771]: I0227 01:08:49.227289 4771 generic.go:334] "Generic (PLEG): container finished" podID="856463d3-39ed-4435-8cfb-e6d3b7e4312d" containerID="00d8efd3ac413e6643bf0ca522b97b03593ca4fcd692824fa0b3c902ed914cf2" exitCode=0 Feb 27 01:08:49 crc kubenswrapper[4771]: I0227 01:08:49.227336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" event={"ID":"856463d3-39ed-4435-8cfb-e6d3b7e4312d","Type":"ContainerDied","Data":"00d8efd3ac413e6643bf0ca522b97b03593ca4fcd692824fa0b3c902ed914cf2"} Feb 27 01:08:49 crc kubenswrapper[4771]: E0227 01:08:49.228882 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" podUID="e0d5634e-ce3f-40a5-b85d-64f8c4708c59" Feb 27 01:08:50 crc kubenswrapper[4771]: I0227 01:08:50.066312 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod \"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:50 crc kubenswrapper[4771]: I0227 01:08:50.068089 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 01:08:50 crc kubenswrapper[4771]: I0227 01:08:50.096128 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/15dd6a85-eabc-4a32-a283-33bf72d2a041-metrics-certs\") pod 
\"network-metrics-daemon-24pv2\" (UID: \"15dd6a85-eabc-4a32-a283-33bf72d2a041\") " pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:50 crc kubenswrapper[4771]: I0227 01:08:50.302637 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 01:08:50 crc kubenswrapper[4771]: I0227 01:08:50.312001 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-24pv2" Feb 27 01:08:51 crc kubenswrapper[4771]: I0227 01:08:51.624085 4771 patch_prober.go:28] interesting pod/route-controller-manager-c4f4946d6-8r6tq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Feb 27 01:08:51 crc kubenswrapper[4771]: I0227 01:08:51.624535 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" podUID="856463d3-39ed-4435-8cfb-e6d3b7e4312d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Feb 27 01:08:52 crc kubenswrapper[4771]: I0227 01:08:52.215385 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:08:54 crc kubenswrapper[4771]: I0227 01:08:54.928467 4771 patch_prober.go:28] interesting pod/controller-manager-fd5687597-psjcb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": context deadline exceeded" start-of-body= Feb 27 01:08:54 crc kubenswrapper[4771]: I0227 01:08:54.928852 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" podUID="b8778c53-0dad-4321-81d2-335e0715546b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": context deadline exceeded" Feb 27 01:08:58 crc kubenswrapper[4771]: I0227 01:08:58.953979 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:08:58 crc kubenswrapper[4771]: I0227 01:08:58.954399 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:09:00 crc kubenswrapper[4771]: E0227 01:09:00.838905 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 01:09:00 crc kubenswrapper[4771]: E0227 01:09:00.839077 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4b8tx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-z5hx5_openshift-marketplace(739ccbd2-56c7-4a26-ad40-4f0f908089e8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 01:09:00 crc kubenswrapper[4771]: E0227 01:09:00.840274 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-z5hx5" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" Feb 27 01:09:01 crc kubenswrapper[4771]: I0227 01:09:01.448923 4771 ???:1] "http: TLS handshake error from 192.168.126.11:34466: no serving certificate available for the kubelet" Feb 27 01:09:01 crc kubenswrapper[4771]: E0227 01:09:01.994832 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-z5hx5" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.057469 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.057658 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67wv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-g5gxk_openshift-marketplace(b4029ae4-2dfb-4351-88f8-08fefb8ab46e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.058924 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-g5gxk" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.072877 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.103785 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l"] Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.104832 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc0c710-02ce-4be7-a8b9-a4b559e7ffce" containerName="pruner" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.104965 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc0c710-02ce-4be7-a8b9-a4b559e7ffce" containerName="pruner" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.105216 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f626e0-de75-453a-a7b2-e9449485c031" containerName="pruner" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.105294 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f626e0-de75-453a-a7b2-e9449485c031" containerName="pruner" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.105362 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856463d3-39ed-4435-8cfb-e6d3b7e4312d" containerName="route-controller-manager" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.105417 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="856463d3-39ed-4435-8cfb-e6d3b7e4312d" containerName="route-controller-manager" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.105601 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="856463d3-39ed-4435-8cfb-e6d3b7e4312d" containerName="route-controller-manager" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.105689 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f626e0-de75-453a-a7b2-e9449485c031" containerName="pruner" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.105783 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc0c710-02ce-4be7-a8b9-a4b559e7ffce" containerName="pruner" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.106275 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.125082 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.125211 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kzlpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lds4k_openshift-marketplace(0b5757cb-321d-4a76-8769-786b28a2b004): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.126470 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lds4k" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.136074 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l"] Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.159968 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.160137 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sstcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xk58g_openshift-marketplace(460ffbff-d0f0-43dc-bde9-6279c9a4b6a3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 01:09:02 crc kubenswrapper[4771]: E0227 01:09:02.163007 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xk58g" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.226853 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-config\") pod \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.226937 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kvr9\" (UniqueName: \"kubernetes.io/projected/856463d3-39ed-4435-8cfb-e6d3b7e4312d-kube-api-access-8kvr9\") pod \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.227228 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856463d3-39ed-4435-8cfb-e6d3b7e4312d-serving-cert\") pod \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.227358 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-client-ca\") pod \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\" (UID: \"856463d3-39ed-4435-8cfb-e6d3b7e4312d\") " Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.227656 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c08e8ff-9791-4158-a79d-933dca605e06-serving-cert\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.227765 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-client-ca\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.227846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-config" (OuterVolumeSpecName: "config") pod "856463d3-39ed-4435-8cfb-e6d3b7e4312d" (UID: "856463d3-39ed-4435-8cfb-e6d3b7e4312d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.227888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-config\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.227914 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpppg\" (UniqueName: \"kubernetes.io/projected/4c08e8ff-9791-4158-a79d-933dca605e06-kube-api-access-bpppg\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.227998 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-client-ca" (OuterVolumeSpecName: "client-ca") pod "856463d3-39ed-4435-8cfb-e6d3b7e4312d" (UID: "856463d3-39ed-4435-8cfb-e6d3b7e4312d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.228058 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.231603 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856463d3-39ed-4435-8cfb-e6d3b7e4312d-kube-api-access-8kvr9" (OuterVolumeSpecName: "kube-api-access-8kvr9") pod "856463d3-39ed-4435-8cfb-e6d3b7e4312d" (UID: "856463d3-39ed-4435-8cfb-e6d3b7e4312d"). InnerVolumeSpecName "kube-api-access-8kvr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.236728 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856463d3-39ed-4435-8cfb-e6d3b7e4312d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "856463d3-39ed-4435-8cfb-e6d3b7e4312d" (UID: "856463d3-39ed-4435-8cfb-e6d3b7e4312d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.328659 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c08e8ff-9791-4158-a79d-933dca605e06-serving-cert\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.328769 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-client-ca\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.328813 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-config\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.328830 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpppg\" (UniqueName: \"kubernetes.io/projected/4c08e8ff-9791-4158-a79d-933dca605e06-kube-api-access-bpppg\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.328885 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/856463d3-39ed-4435-8cfb-e6d3b7e4312d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.328896 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/856463d3-39ed-4435-8cfb-e6d3b7e4312d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.328905 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kvr9\" (UniqueName: \"kubernetes.io/projected/856463d3-39ed-4435-8cfb-e6d3b7e4312d-kube-api-access-8kvr9\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.329863 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-client-ca\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.330876 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-config\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.333082 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c08e8ff-9791-4158-a79d-933dca605e06-serving-cert\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.338299 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.338425 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" event={"ID":"856463d3-39ed-4435-8cfb-e6d3b7e4312d","Type":"ContainerDied","Data":"400ba0f2b8fd2fd07c286354eea9e3c710958d499754660d31db73148263b439"} Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.338526 4771 scope.go:117] "RemoveContainer" containerID="00d8efd3ac413e6643bf0ca522b97b03593ca4fcd692824fa0b3c902ed914cf2" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.345601 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpppg\" (UniqueName: \"kubernetes.io/projected/4c08e8ff-9791-4158-a79d-933dca605e06-kube-api-access-bpppg\") pod \"route-controller-manager-67cb7d9cdc-dhf8l\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.386200 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"] Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.388606 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq"] Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.437748 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.625021 4771 patch_prober.go:28] interesting pod/route-controller-manager-c4f4946d6-8r6tq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 01:09:02 crc kubenswrapper[4771]: I0227 01:09:02.625122 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-c4f4946d6-8r6tq" podUID="856463d3-39ed-4435-8cfb-e6d3b7e4312d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.513379 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-g5gxk" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.513392 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lds4k" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.513474 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xk58g" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.528632 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.528802 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vxnk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sbc2d_openshift-marketplace(96a309b5-c10d-49d7-ade8-3c087250dd91): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.530675 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sbc2d" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.541696 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.542035 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p2vfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-d5l2f_openshift-marketplace(deb9a4a5-1474-4744-a57e-fcdcc97922ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 01:09:03 crc kubenswrapper[4771]: E0227 01:09:03.543405 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-d5l2f" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.591341 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.648512 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-proxy-ca-bundles\") pod \"b8778c53-0dad-4321-81d2-335e0715546b\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.648640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-config\") pod \"b8778c53-0dad-4321-81d2-335e0715546b\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.648690 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-client-ca\") pod \"b8778c53-0dad-4321-81d2-335e0715546b\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.648719 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwxf6\" (UniqueName: \"kubernetes.io/projected/b8778c53-0dad-4321-81d2-335e0715546b-kube-api-access-gwxf6\") pod \"b8778c53-0dad-4321-81d2-335e0715546b\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.648790 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8778c53-0dad-4321-81d2-335e0715546b-serving-cert\") pod \"b8778c53-0dad-4321-81d2-335e0715546b\" (UID: \"b8778c53-0dad-4321-81d2-335e0715546b\") " Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.649616 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-client-ca" (OuterVolumeSpecName: "client-ca") pod "b8778c53-0dad-4321-81d2-335e0715546b" (UID: "b8778c53-0dad-4321-81d2-335e0715546b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.649666 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b8778c53-0dad-4321-81d2-335e0715546b" (UID: "b8778c53-0dad-4321-81d2-335e0715546b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.649920 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-config" (OuterVolumeSpecName: "config") pod "b8778c53-0dad-4321-81d2-335e0715546b" (UID: "b8778c53-0dad-4321-81d2-335e0715546b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.653112 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8778c53-0dad-4321-81d2-335e0715546b-kube-api-access-gwxf6" (OuterVolumeSpecName: "kube-api-access-gwxf6") pod "b8778c53-0dad-4321-81d2-335e0715546b" (UID: "b8778c53-0dad-4321-81d2-335e0715546b"). InnerVolumeSpecName "kube-api-access-gwxf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.653930 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8778c53-0dad-4321-81d2-335e0715546b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b8778c53-0dad-4321-81d2-335e0715546b" (UID: "b8778c53-0dad-4321-81d2-335e0715546b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.750788 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.750825 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.750835 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8778c53-0dad-4321-81d2-335e0715546b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.750844 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwxf6\" (UniqueName: \"kubernetes.io/projected/b8778c53-0dad-4321-81d2-335e0715546b-kube-api-access-gwxf6\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.750860 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8778c53-0dad-4321-81d2-335e0715546b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:03 crc kubenswrapper[4771]: I0227 01:09:03.780250 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856463d3-39ed-4435-8cfb-e6d3b7e4312d" path="/var/lib/kubelet/pods/856463d3-39ed-4435-8cfb-e6d3b7e4312d/volumes" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.349403 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.350286 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd5687597-psjcb" event={"ID":"b8778c53-0dad-4321-81d2-335e0715546b","Type":"ContainerDied","Data":"c8bd06399a6d32f57bdaf65766368b38547c7480a80c25e34f70cf0f32be3429"} Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.375957 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tct95" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.381261 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fd5687597-psjcb"] Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.384906 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-fd5687597-psjcb"] Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.600056 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5ccb7bb557-wrphg"] Feb 27 01:09:04 crc kubenswrapper[4771]: E0227 01:09:04.600576 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8778c53-0dad-4321-81d2-335e0715546b" containerName="controller-manager" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.600590 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8778c53-0dad-4321-81d2-335e0715546b" containerName="controller-manager" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.600690 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8778c53-0dad-4321-81d2-335e0715546b" containerName="controller-manager" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.601048 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.604512 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.604655 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.607952 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ccb7bb557-wrphg"] Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.608136 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.608240 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.608360 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.608583 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.633870 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.659389 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d419ea-0351-478d-a54b-99f5597d9dd0-serving-cert\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.659431 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-proxy-ca-bundles\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.659450 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-client-ca\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.659485 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csbtl\" (UniqueName: \"kubernetes.io/projected/18d419ea-0351-478d-a54b-99f5597d9dd0-kube-api-access-csbtl\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.659511 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-config\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.760421 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d419ea-0351-478d-a54b-99f5597d9dd0-serving-cert\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.760457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-proxy-ca-bundles\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.760475 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-client-ca\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.760506 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csbtl\" (UniqueName: \"kubernetes.io/projected/18d419ea-0351-478d-a54b-99f5597d9dd0-kube-api-access-csbtl\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.760528 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-config\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.761769 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-config\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.763122 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-client-ca\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.763364 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-proxy-ca-bundles\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" 
Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.767707 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d419ea-0351-478d-a54b-99f5597d9dd0-serving-cert\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.777322 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csbtl\" (UniqueName: \"kubernetes.io/projected/18d419ea-0351-478d-a54b-99f5597d9dd0-kube-api-access-csbtl\") pod \"controller-manager-5ccb7bb557-wrphg\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.802224 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 01:09:04 crc kubenswrapper[4771]: I0227 01:09:04.922474 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:05 crc kubenswrapper[4771]: I0227 01:09:05.782660 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8778c53-0dad-4321-81d2-335e0715546b" path="/var/lib/kubelet/pods/b8778c53-0dad-4321-81d2-335e0715546b/volumes" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.677584 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.679273 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.680685 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.681430 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.688295 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 01:09:06 crc kubenswrapper[4771]: E0227 01:09:06.751039 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-d5l2f" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.782709 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.783873 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:06 crc kubenswrapper[4771]: E0227 01:09:06.790381 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 01:09:06 crc kubenswrapper[4771]: E0227 01:09:06.790597 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wg6c9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7r96b_openshift-marketplace(33d95d7b-dfe7-495a-b686-5737dd95b974): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 01:09:06 crc kubenswrapper[4771]: E0227 01:09:06.794403 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7r96b" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" Feb 27 01:09:06 crc kubenswrapper[4771]: E0227 01:09:06.800314 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 01:09:06 crc kubenswrapper[4771]: E0227 01:09:06.800517 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f2x9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qs5th_openshift-marketplace(65fbc634-d941-4dee-a758-3b0b10bd60f0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 01:09:06 crc kubenswrapper[4771]: E0227 01:09:06.801713 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qs5th" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.806627 4771 scope.go:117] "RemoveContainer" containerID="a0446a20cb480a746c947e2ebc6ceb1e793df23e87ff078cd62dec08f33ae858" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.885975 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.886222 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.886305 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:06 crc kubenswrapper[4771]: I0227 01:09:06.908524 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.005472 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.221473 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-24pv2"] Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.229892 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l"] Feb 27 01:09:07 crc kubenswrapper[4771]: W0227 01:09:07.230032 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15dd6a85_eabc_4a32_a283_33bf72d2a041.slice/crio-d65b1b9c4343c93049c3d90fc94096f5a32263c0e626326933e6bd5e25bc0fb9 WatchSource:0}: Error finding container d65b1b9c4343c93049c3d90fc94096f5a32263c0e626326933e6bd5e25bc0fb9: Status 404 returned error can't find the container with id d65b1b9c4343c93049c3d90fc94096f5a32263c0e626326933e6bd5e25bc0fb9 Feb 27 01:09:07 crc kubenswrapper[4771]: W0227 01:09:07.237082 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c08e8ff_9791_4158_a79d_933dca605e06.slice/crio-d37cee6d734425a4a88fc45c6a62d0ed77adab27cdedb62bc0cc9755ba617ef7 WatchSource:0}: Error finding container d37cee6d734425a4a88fc45c6a62d0ed77adab27cdedb62bc0cc9755ba617ef7: Status 404 returned error can't find the container with id d37cee6d734425a4a88fc45c6a62d0ed77adab27cdedb62bc0cc9755ba617ef7 Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.331313 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ccb7bb557-wrphg"] Feb 27 01:09:07 crc kubenswrapper[4771]: W0227 01:09:07.342706 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18d419ea_0351_478d_a54b_99f5597d9dd0.slice/crio-526a1283abcbbcf08cc831d256e2862a8402a680ee57168eb37dace6c7dc78aa WatchSource:0}: Error finding container 526a1283abcbbcf08cc831d256e2862a8402a680ee57168eb37dace6c7dc78aa: Status 404 returned error can't find the container with id 526a1283abcbbcf08cc831d256e2862a8402a680ee57168eb37dace6c7dc78aa Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.363687 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-24pv2" event={"ID":"15dd6a85-eabc-4a32-a283-33bf72d2a041","Type":"ContainerStarted","Data":"d65b1b9c4343c93049c3d90fc94096f5a32263c0e626326933e6bd5e25bc0fb9"} Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.364941 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" event={"ID":"4c08e8ff-9791-4158-a79d-933dca605e06","Type":"ContainerStarted","Data":"d37cee6d734425a4a88fc45c6a62d0ed77adab27cdedb62bc0cc9755ba617ef7"} Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.368057 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" event={"ID":"18d419ea-0351-478d-a54b-99f5597d9dd0","Type":"ContainerStarted","Data":"526a1283abcbbcf08cc831d256e2862a8402a680ee57168eb37dace6c7dc78aa"} Feb 27 01:09:07 crc 
kubenswrapper[4771]: I0227 01:09:07.369325 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" event={"ID":"e0d5634e-ce3f-40a5-b85d-64f8c4708c59","Type":"ContainerStarted","Data":"e303e79e8da8b68b1195811949b089c0de2403ad7876b01aa15b28f29a99be5f"} Feb 27 01:09:07 crc kubenswrapper[4771]: E0227 01:09:07.373000 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7r96b" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" Feb 27 01:09:07 crc kubenswrapper[4771]: E0227 01:09:07.378055 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qs5th" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.388261 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" podStartSLOduration=26.68062224 podStartE2EDuration="1m7.388239767s" podCreationTimestamp="2026-02-27 01:08:00 +0000 UTC" firstStartedPulling="2026-02-27 01:08:26.187506016 +0000 UTC m=+219.125067304" lastFinishedPulling="2026-02-27 01:09:06.895123543 +0000 UTC m=+259.832684831" observedRunningTime="2026-02-27 01:09:07.386522511 +0000 UTC m=+260.324083799" watchObservedRunningTime="2026-02-27 01:09:07.388239767 +0000 UTC m=+260.325801055" Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.430395 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.694140 4771 csr.go:261] certificate signing request csr-4hm52 is approved, waiting to be issued Feb 27 01:09:07 crc kubenswrapper[4771]: I0227 01:09:07.700499 4771 csr.go:257] certificate signing request csr-4hm52 is issued Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.382798 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad","Type":"ContainerStarted","Data":"e47e838176efa1d1725ad452d2379f9d3dc0f49431cb25d5fece74bc87d1fc2a"} Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.383118 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad","Type":"ContainerStarted","Data":"7a0b2896aeafe007da8470282367ef4fccefc0e7b39dad639cc5494031c2b539"} Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.390967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-24pv2" event={"ID":"15dd6a85-eabc-4a32-a283-33bf72d2a041","Type":"ContainerStarted","Data":"64a10e0ee99fe2be5c4fe6968f1225917a45d9b166dba17af7b6406fc4c4d1da"} Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.391018 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-24pv2" event={"ID":"15dd6a85-eabc-4a32-a283-33bf72d2a041","Type":"ContainerStarted","Data":"6768c31a230aba39ecab270163de43722a7a8a3b700b9164aace634e524a5be0"} Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.395110 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" event={"ID":"4c08e8ff-9791-4158-a79d-933dca605e06","Type":"ContainerStarted","Data":"4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24"} Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.395965 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.400878 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" event={"ID":"18d419ea-0351-478d-a54b-99f5597d9dd0","Type":"ContainerStarted","Data":"9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce"} Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.401339 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.402008 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.404097 4771 generic.go:334] "Generic (PLEG): container finished" podID="e0d5634e-ce3f-40a5-b85d-64f8c4708c59" containerID="e303e79e8da8b68b1195811949b089c0de2403ad7876b01aa15b28f29a99be5f" exitCode=0 Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.404130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" event={"ID":"e0d5634e-ce3f-40a5-b85d-64f8c4708c59","Type":"ContainerDied","Data":"e303e79e8da8b68b1195811949b089c0de2403ad7876b01aa15b28f29a99be5f"} Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.407881 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.409889 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.409828489 podStartE2EDuration="2.409828489s" podCreationTimestamp="2026-02-27 01:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:08.40237601 +0000 UTC m=+261.339937308" watchObservedRunningTime="2026-02-27 01:09:08.409828489 +0000 UTC m=+261.347389797" Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.425971 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" podStartSLOduration=20.425953029 podStartE2EDuration="20.425953029s" podCreationTimestamp="2026-02-27 01:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:08.423964046 +0000 UTC m=+261.361525334" watchObservedRunningTime="2026-02-27 01:09:08.425953029 +0000 UTC m=+261.363514317" Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.446186 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" podStartSLOduration=20.44616818 podStartE2EDuration="20.44616818s" podCreationTimestamp="2026-02-27 01:08:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:08.443032426 +0000 UTC m=+261.380593754" watchObservedRunningTime="2026-02-27 01:09:08.44616818 +0000 UTC m=+261.383729478" Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.466872 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-24pv2" podStartSLOduration=215.466853583 podStartE2EDuration="3m35.466853583s" podCreationTimestamp="2026-02-27 01:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:08.464566491 +0000 UTC m=+261.402127789" watchObservedRunningTime="2026-02-27 01:09:08.466853583 +0000 UTC m=+261.404414871" Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.589185 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5ccb7bb557-wrphg"] Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.681500 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l"] Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.702225 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-11 17:31:21.457490779 +0000 UTC Feb 27 01:09:08 crc kubenswrapper[4771]: I0227 01:09:08.702451 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6184h22m12.755043393s for next certificate rotation Feb 27 01:09:09 crc kubenswrapper[4771]: I0227 01:09:09.421350 4771 generic.go:334] "Generic (PLEG): container finished" podID="7fe6c6d1-793b-4db4-bfff-b68683a9b4ad" containerID="e47e838176efa1d1725ad452d2379f9d3dc0f49431cb25d5fece74bc87d1fc2a" exitCode=0 Feb 27 01:09:09 crc kubenswrapper[4771]: I0227 01:09:09.421470 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad","Type":"ContainerDied","Data":"e47e838176efa1d1725ad452d2379f9d3dc0f49431cb25d5fece74bc87d1fc2a"} Feb 27 01:09:09 crc kubenswrapper[4771]: I0227 01:09:09.673352 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" Feb 27 01:09:09 crc kubenswrapper[4771]: I0227 01:09:09.703512 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-25 08:55:51.71513066 +0000 UTC Feb 27 01:09:09 crc kubenswrapper[4771]: I0227 01:09:09.703587 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6511h46m42.011547381s for next certificate rotation Feb 27 01:09:09 crc kubenswrapper[4771]: I0227 01:09:09.722355 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttp4j\" (UniqueName: \"kubernetes.io/projected/e0d5634e-ce3f-40a5-b85d-64f8c4708c59-kube-api-access-ttp4j\") pod \"e0d5634e-ce3f-40a5-b85d-64f8c4708c59\" (UID: \"e0d5634e-ce3f-40a5-b85d-64f8c4708c59\") " Feb 27 01:09:09 crc kubenswrapper[4771]: I0227 01:09:09.727628 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d5634e-ce3f-40a5-b85d-64f8c4708c59-kube-api-access-ttp4j" (OuterVolumeSpecName: "kube-api-access-ttp4j") pod "e0d5634e-ce3f-40a5-b85d-64f8c4708c59" (UID: "e0d5634e-ce3f-40a5-b85d-64f8c4708c59"). InnerVolumeSpecName "kube-api-access-ttp4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:09 crc kubenswrapper[4771]: I0227 01:09:09.823449 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttp4j\" (UniqueName: \"kubernetes.io/projected/e0d5634e-ce3f-40a5-b85d-64f8c4708c59-kube-api-access-ttp4j\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.427808 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" event={"ID":"e0d5634e-ce3f-40a5-b85d-64f8c4708c59","Type":"ContainerDied","Data":"a8cbcd7114d387115d3a054b6ade4c8a81ee65dc780cca737549331a0b6d2646"} Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.430461 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8cbcd7114d387115d3a054b6ade4c8a81ee65dc780cca737549331a0b6d2646" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.428082 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" podUID="18d419ea-0351-478d-a54b-99f5597d9dd0" containerName="controller-manager" containerID="cri-o://9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce" gracePeriod=30 Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.428295 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" podUID="4c08e8ff-9791-4158-a79d-933dca605e06" containerName="route-controller-manager" containerID="cri-o://4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24" gracePeriod=30 Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.427952 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535908-hhvn5" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.693466 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.833817 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kubelet-dir\") pod \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\" (UID: \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\") " Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.834029 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7fe6c6d1-793b-4db4-bfff-b68683a9b4ad" (UID: "7fe6c6d1-793b-4db4-bfff-b68683a9b4ad"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.834070 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kube-api-access\") pod \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\" (UID: \"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad\") " Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.834332 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.843914 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7fe6c6d1-793b-4db4-bfff-b68683a9b4ad" (UID: "7fe6c6d1-793b-4db4-bfff-b68683a9b4ad"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.880032 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.888363 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:10 crc kubenswrapper[4771]: I0227 01:09:10.935398 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7fe6c6d1-793b-4db4-bfff-b68683a9b4ad-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036613 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-config\") pod \"4c08e8ff-9791-4158-a79d-933dca605e06\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036690 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-config\") pod \"18d419ea-0351-478d-a54b-99f5597d9dd0\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036723 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpppg\" (UniqueName: \"kubernetes.io/projected/4c08e8ff-9791-4158-a79d-933dca605e06-kube-api-access-bpppg\") pod \"4c08e8ff-9791-4158-a79d-933dca605e06\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036767 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-proxy-ca-bundles\") pod \"18d419ea-0351-478d-a54b-99f5597d9dd0\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csbtl\" (UniqueName: \"kubernetes.io/projected/18d419ea-0351-478d-a54b-99f5597d9dd0-kube-api-access-csbtl\") pod \"18d419ea-0351-478d-a54b-99f5597d9dd0\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036834 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-client-ca\") pod \"4c08e8ff-9791-4158-a79d-933dca605e06\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036874 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-client-ca\") pod \"18d419ea-0351-478d-a54b-99f5597d9dd0\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036899 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d419ea-0351-478d-a54b-99f5597d9dd0-serving-cert\") pod \"18d419ea-0351-478d-a54b-99f5597d9dd0\" (UID: \"18d419ea-0351-478d-a54b-99f5597d9dd0\") " Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.036940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c08e8ff-9791-4158-a79d-933dca605e06-serving-cert\") pod \"4c08e8ff-9791-4158-a79d-933dca605e06\" (UID: \"4c08e8ff-9791-4158-a79d-933dca605e06\") " Feb 27 01:09:11 crc 
kubenswrapper[4771]: I0227 01:09:11.037563 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-config" (OuterVolumeSpecName: "config") pod "4c08e8ff-9791-4158-a79d-933dca605e06" (UID: "4c08e8ff-9791-4158-a79d-933dca605e06"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.037663 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "18d419ea-0351-478d-a54b-99f5597d9dd0" (UID: "18d419ea-0351-478d-a54b-99f5597d9dd0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.037766 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-client-ca" (OuterVolumeSpecName: "client-ca") pod "18d419ea-0351-478d-a54b-99f5597d9dd0" (UID: "18d419ea-0351-478d-a54b-99f5597d9dd0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.037811 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-config" (OuterVolumeSpecName: "config") pod "18d419ea-0351-478d-a54b-99f5597d9dd0" (UID: "18d419ea-0351-478d-a54b-99f5597d9dd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.037799 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c08e8ff-9791-4158-a79d-933dca605e06" (UID: "4c08e8ff-9791-4158-a79d-933dca605e06"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.039613 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d419ea-0351-478d-a54b-99f5597d9dd0-kube-api-access-csbtl" (OuterVolumeSpecName: "kube-api-access-csbtl") pod "18d419ea-0351-478d-a54b-99f5597d9dd0" (UID: "18d419ea-0351-478d-a54b-99f5597d9dd0"). InnerVolumeSpecName "kube-api-access-csbtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.039914 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d419ea-0351-478d-a54b-99f5597d9dd0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "18d419ea-0351-478d-a54b-99f5597d9dd0" (UID: "18d419ea-0351-478d-a54b-99f5597d9dd0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.039909 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c08e8ff-9791-4158-a79d-933dca605e06-kube-api-access-bpppg" (OuterVolumeSpecName: "kube-api-access-bpppg") pod "4c08e8ff-9791-4158-a79d-933dca605e06" (UID: "4c08e8ff-9791-4158-a79d-933dca605e06"). InnerVolumeSpecName "kube-api-access-bpppg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.040109 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c08e8ff-9791-4158-a79d-933dca605e06-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c08e8ff-9791-4158-a79d-933dca605e06" (UID: "4c08e8ff-9791-4158-a79d-933dca605e06"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138353 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c08e8ff-9791-4158-a79d-933dca605e06-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138390 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138399 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138409 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpppg\" (UniqueName: \"kubernetes.io/projected/4c08e8ff-9791-4158-a79d-933dca605e06-kube-api-access-bpppg\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138421 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138430 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csbtl\" (UniqueName: \"kubernetes.io/projected/18d419ea-0351-478d-a54b-99f5597d9dd0-kube-api-access-csbtl\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138437 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c08e8ff-9791-4158-a79d-933dca605e06-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138445 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18d419ea-0351-478d-a54b-99f5597d9dd0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.138452 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18d419ea-0351-478d-a54b-99f5597d9dd0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.432891 4771 generic.go:334] "Generic (PLEG): container finished" podID="18d419ea-0351-478d-a54b-99f5597d9dd0" containerID="9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce" exitCode=0 Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.432942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" event={"ID":"18d419ea-0351-478d-a54b-99f5597d9dd0","Type":"ContainerDied","Data":"9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce"} Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.432967 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" event={"ID":"18d419ea-0351-478d-a54b-99f5597d9dd0","Type":"ContainerDied","Data":"526a1283abcbbcf08cc831d256e2862a8402a680ee57168eb37dace6c7dc78aa"} Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.432981 4771 scope.go:117] "RemoveContainer" containerID="9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.433073 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ccb7bb557-wrphg" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.436631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7fe6c6d1-793b-4db4-bfff-b68683a9b4ad","Type":"ContainerDied","Data":"7a0b2896aeafe007da8470282367ef4fccefc0e7b39dad639cc5494031c2b539"} Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.436671 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a0b2896aeafe007da8470282367ef4fccefc0e7b39dad639cc5494031c2b539" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.436720 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.451279 4771 generic.go:334] "Generic (PLEG): container finished" podID="4c08e8ff-9791-4158-a79d-933dca605e06" containerID="4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24" exitCode=0 Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.451319 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" event={"ID":"4c08e8ff-9791-4158-a79d-933dca605e06","Type":"ContainerDied","Data":"4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24"} Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.451343 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" event={"ID":"4c08e8ff-9791-4158-a79d-933dca605e06","Type":"ContainerDied","Data":"d37cee6d734425a4a88fc45c6a62d0ed77adab27cdedb62bc0cc9755ba617ef7"} Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.451619 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.453626 4771 scope.go:117] "RemoveContainer" containerID="9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce" Feb 27 01:09:11 crc kubenswrapper[4771]: E0227 01:09:11.453900 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce\": container with ID starting with 9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce not found: ID does not exist" containerID="9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.453928 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce"} err="failed to get container status \"9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce\": rpc error: code = NotFound desc = could not find container \"9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce\": container with ID starting with 9265ddaa678deb10d03bf12d26f3a54237dba31bd1e3db5074316de2264397ce not found: ID does not exist" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.453951 4771 scope.go:117] "RemoveContainer" containerID="4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.468676 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5ccb7bb557-wrphg"] Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.471527 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5ccb7bb557-wrphg"] Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.475533 4771 scope.go:117] "RemoveContainer" containerID="4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24" Feb 27 01:09:11 crc kubenswrapper[4771]: E0227 01:09:11.482368 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24\": container with ID starting with 4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24 not found: ID does not exist" containerID="4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.482644 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24"} err="failed to get container status \"4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24\": rpc error: code = NotFound desc = could not find container \"4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24\": container with ID starting with 4bbca217c13f61b688f3f0078a137122b3a261a9cddc3e60d44f0c42966d9a24 not found: ID does not exist" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.485052 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l"] Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.487857 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cb7d9cdc-dhf8l"] Feb 27 
Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.780315 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d419ea-0351-478d-a54b-99f5597d9dd0" path="/var/lib/kubelet/pods/18d419ea-0351-478d-a54b-99f5597d9dd0/volumes" Feb 27 01:09:11 crc kubenswrapper[4771]: I0227 01:09:11.780813 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c08e8ff-9791-4158-a79d-933dca605e06" path="/var/lib/kubelet/pods/4c08e8ff-9791-4158-a79d-933dca605e06/volumes" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.609473 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74565c8b54-pzq4n"] Feb 27 01:09:12 crc kubenswrapper[4771]: E0227 01:09:12.610008 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c08e8ff-9791-4158-a79d-933dca605e06" containerName="route-controller-manager" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.610030 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c08e8ff-9791-4158-a79d-933dca605e06" containerName="route-controller-manager" Feb 27 01:09:12 crc kubenswrapper[4771]: E0227 01:09:12.610049 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d5634e-ce3f-40a5-b85d-64f8c4708c59" containerName="oc" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.610064 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d5634e-ce3f-40a5-b85d-64f8c4708c59" containerName="oc" Feb 27 01:09:12 crc kubenswrapper[4771]: E0227 01:09:12.610093 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d419ea-0351-478d-a54b-99f5597d9dd0" containerName="controller-manager" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.610107 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d419ea-0351-478d-a54b-99f5597d9dd0" containerName="controller-manager" Feb 27 01:09:12 crc kubenswrapper[4771]: E0227 01:09:12.610130 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe6c6d1-793b-4db4-bfff-b68683a9b4ad" containerName="pruner" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.610143 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe6c6d1-793b-4db4-bfff-b68683a9b4ad" containerName="pruner" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.610326 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d419ea-0351-478d-a54b-99f5597d9dd0" containerName="controller-manager" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.610343 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0d5634e-ce3f-40a5-b85d-64f8c4708c59" containerName="oc" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.610365 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe6c6d1-793b-4db4-bfff-b68683a9b4ad" containerName="pruner" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.610384 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c08e8ff-9791-4158-a79d-933dca605e06" containerName="route-controller-manager"
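[annotation] The cpu_manager and memory_manager RemoveStaleState lines fire when the replacement controller-manager pod is admitted: before computing new assignments, both managers drop any per-container resource state still keyed to pods that no longer exist (the four UIDs torn down just above). The E-level severity is cosmetic; the sweep itself is routine housekeeping. A toy sketch of that purge pattern, with a plain map standing in for the managers' checkpointed state; the types and values here are illustrative, not kubelet internals:

package main

import "fmt"

// key identifies a per-container resource assignment (CPUSet, memory blocks,
// and so on) by pod UID and container name.
type key struct{ podUID, container string }

// removeStaleState drops every assignment whose pod is no longer known,
// mirroring the "RemoveStaleState: removing container" sweep in the log.
func removeStaleState(state map[key]string, alive map[string]bool) {
	for k := range state {
		if !alive[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", k.podUID, k.container)
			delete(state, k)
		}
	}
}

func main() {
	state := map[key]string{
		{"18d419ea-0351-478d-a54b-99f5597d9dd0", "controller-manager"}:       "cpus 0-3",
		{"4c08e8ff-9791-4158-a79d-933dca605e06", "route-controller-manager"}: "cpus 0-3",
	}
	// Neither pod survives in the pod manager, so both entries are swept.
	alive := map[string]bool{}
	removeStaleState(state, alive)
	fmt.Println("remaining assignments:", len(state))
}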
Need to start a new one" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.617593 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.618744 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv"] Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.617997 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.618269 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.618314 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.619339 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.620778 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.624057 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.624584 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.624800 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.625283 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.625437 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.625610 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.629364 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.629681 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.629827 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74565c8b54-pzq4n"] Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.633749 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv"] Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.755574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-proxy-ca-bundles\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.755631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f000cf8a-5b9a-472c-b986-881daa2151e1-serving-cert\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.755667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-config\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.755735 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnmh\" (UniqueName: \"kubernetes.io/projected/23a12bb6-171f-4e2b-9706-1b3ba0948752-kube-api-access-nhnmh\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.755874 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a12bb6-171f-4e2b-9706-1b3ba0948752-serving-cert\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.755920 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62tz\" (UniqueName: \"kubernetes.io/projected/f000cf8a-5b9a-472c-b986-881daa2151e1-kube-api-access-f62tz\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.755983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-client-ca\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.756029 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-config\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.756045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-client-ca\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.857579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-config\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.857907 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnmh\" (UniqueName: \"kubernetes.io/projected/23a12bb6-171f-4e2b-9706-1b3ba0948752-kube-api-access-nhnmh\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.857946 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a12bb6-171f-4e2b-9706-1b3ba0948752-serving-cert\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.857971 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62tz\" (UniqueName: \"kubernetes.io/projected/f000cf8a-5b9a-472c-b986-881daa2151e1-kube-api-access-f62tz\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.858004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-client-ca\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.858023 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-config\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.858036 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-client-ca\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.858064 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-proxy-ca-bundles\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: 
\"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.858098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f000cf8a-5b9a-472c-b986-881daa2151e1-serving-cert\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.858881 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-config\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.859103 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-client-ca\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.860946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-config\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.861068 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-proxy-ca-bundles\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.861597 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-client-ca\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.865202 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f000cf8a-5b9a-472c-b986-881daa2151e1-serving-cert\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.865204 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a12bb6-171f-4e2b-9706-1b3ba0948752-serving-cert\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.873682 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nhnmh\" (UniqueName: \"kubernetes.io/projected/23a12bb6-171f-4e2b-9706-1b3ba0948752-kube-api-access-nhnmh\") pod \"controller-manager-74565c8b54-pzq4n\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.879094 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62tz\" (UniqueName: \"kubernetes.io/projected/f000cf8a-5b9a-472c-b986-881daa2151e1-kube-api-access-f62tz\") pod \"route-controller-manager-75c84b495d-b5fpv\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.943947 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:12 crc kubenswrapper[4771]: I0227 01:09:12.949599 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.319469 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74565c8b54-pzq4n"] Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.355696 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv"] Feb 27 01:09:13 crc kubenswrapper[4771]: W0227 01:09:13.361205 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf000cf8a_5b9a_472c_b986_881daa2151e1.slice/crio-45d0b174205ff1ea7f95ebe53a08e6edff03851e7f5ee4069a8f93b4e0dc2ea9 WatchSource:0}: Error finding container 45d0b174205ff1ea7f95ebe53a08e6edff03851e7f5ee4069a8f93b4e0dc2ea9: Status 404 returned error can't find the container with id 45d0b174205ff1ea7f95ebe53a08e6edff03851e7f5ee4069a8f93b4e0dc2ea9 Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.466711 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" event={"ID":"f000cf8a-5b9a-472c-b986-881daa2151e1","Type":"ContainerStarted","Data":"6a7e8563774609c883331d2fe0b74fb7e738953d3b2c873c8c845205a013fd9c"} Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.466757 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" event={"ID":"f000cf8a-5b9a-472c-b986-881daa2151e1","Type":"ContainerStarted","Data":"45d0b174205ff1ea7f95ebe53a08e6edff03851e7f5ee4069a8f93b4e0dc2ea9"} Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.468772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" event={"ID":"23a12bb6-171f-4e2b-9706-1b3ba0948752","Type":"ContainerStarted","Data":"c808d905c2a943fcda916ad222a2d5dedb686587c6484600f094a8f7699eb5e8"} Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.469097 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.471592 4771 patch_prober.go:28] interesting pod/controller-manager-74565c8b54-pzq4n container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.471666 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" podUID="23a12bb6-171f-4e2b-9706-1b3ba0948752" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.867936 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" podStartSLOduration=5.86791329 podStartE2EDuration="5.86791329s" podCreationTimestamp="2026-02-27 01:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:13.493929046 +0000 UTC m=+266.431490334" watchObservedRunningTime="2026-02-27 01:09:13.86791329 +0000 UTC m=+266.805474578" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.870866 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.871678 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.875197 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.876087 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.894410 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.970444 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.970538 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-var-lock\") pod \"installer-9-crc\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:13 crc kubenswrapper[4771]: I0227 01:09:13.970587 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dea387ee-2398-4e31-b664-58bae30775ca-kube-api-access\") pod \"installer-9-crc\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc"
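[annotation] The Readiness probe failure above ("connection refused" against https://10.217.0.61:8443/healthz, roughly one second after the container started) is the normal race between container start and the server binding its listen socket; the same pod flips to status="ready" on the next probe. A probe loop in that spirit, sketched with the endpoint taken from the log and plain HTTP retries rather than the kubelet's prober machinery; the attempt count and period are assumptions:

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Endpoint copied from the log; the pod's serving cert is not trusted
	// here, so certificate verification is skipped for this sketch only.
	url := "https://10.217.0.61:8443/healthz"
	client := &http.Client{
		Timeout: 1 * time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}

	for attempt := 1; attempt <= 5; attempt++ {
		resp, err := client.Get(url)
		if err != nil {
			// An early "connection refused" just means the container has not
			// bound its port yet; report failure and retry next period.
			fmt.Printf("probe %d failed: %v\n", attempt, err)
			time.Sleep(1 * time.Second)
			continue
		}
		resp.Body.Close()
		if resp.StatusCode == http.StatusOK {
			fmt.Printf("probe %d: ready\n", attempt)
			return
		}
		fmt.Printf("probe %d: status %d\n", attempt, resp.StatusCode)
		time.Sleep(1 * time.Second)
	}
}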
\"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.071838 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dea387ee-2398-4e31-b664-58bae30775ca-kube-api-access\") pod \"installer-9-crc\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.071894 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.071901 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-var-lock\") pod \"installer-9-crc\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.071956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.090456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dea387ee-2398-4e31-b664-58bae30775ca-kube-api-access\") pod \"installer-9-crc\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.239904 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.456269 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.476240 4771 generic.go:334] "Generic (PLEG): container finished" podID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerID="28ff4feea473c2d9c69ca8261a4e864aeb36499b9e1851ea7fdfadf1057cc13f" exitCode=0 Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.476309 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5hx5" event={"ID":"739ccbd2-56c7-4a26-ad40-4f0f908089e8","Type":"ContainerDied","Data":"28ff4feea473c2d9c69ca8261a4e864aeb36499b9e1851ea7fdfadf1057cc13f"} Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.479068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" event={"ID":"23a12bb6-171f-4e2b-9706-1b3ba0948752","Type":"ContainerStarted","Data":"31f738954f9f29b777d10e12b36b1bbbfabfa0e060d3ebeb82f88afb4ad90205"} Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.484596 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.485985 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dea387ee-2398-4e31-b664-58bae30775ca","Type":"ContainerStarted","Data":"df397424bf00a1373d8980140f8119be6a6ba254b1c8dc93c00d0402d27d9b66"} Feb 27 01:09:14 crc kubenswrapper[4771]: I0227 01:09:14.511527 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" podStartSLOduration=6.511501573 podStartE2EDuration="6.511501573s" podCreationTimestamp="2026-02-27 01:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:14.509905561 +0000 UTC m=+267.447466879" watchObservedRunningTime="2026-02-27 01:09:14.511501573 +0000 UTC m=+267.449062881" Feb 27 01:09:15 crc kubenswrapper[4771]: I0227 01:09:15.493130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dea387ee-2398-4e31-b664-58bae30775ca","Type":"ContainerStarted","Data":"c8129f634d51517be3065265b53503f117583182ac0d29c505a79acf301629fd"} Feb 27 01:09:15 crc kubenswrapper[4771]: I0227 01:09:15.496577 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lds4k" event={"ID":"0b5757cb-321d-4a76-8769-786b28a2b004","Type":"ContainerStarted","Data":"d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3"} Feb 27 01:09:15 crc kubenswrapper[4771]: I0227 01:09:15.500629 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5hx5" event={"ID":"739ccbd2-56c7-4a26-ad40-4f0f908089e8","Type":"ContainerStarted","Data":"152d0ec9817218b57a398bab4a74271426223cf7219a89961aefc31ce90a8199"} Feb 27 01:09:15 crc kubenswrapper[4771]: I0227 01:09:15.500771 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:15 crc kubenswrapper[4771]: I0227 01:09:15.507197 4771 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:15 crc kubenswrapper[4771]: I0227 01:09:15.510229 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.51021345 podStartE2EDuration="2.51021345s" podCreationTimestamp="2026-02-27 01:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:15.508965127 +0000 UTC m=+268.446526425" watchObservedRunningTime="2026-02-27 01:09:15.51021345 +0000 UTC m=+268.447774738" Feb 27 01:09:15 crc kubenswrapper[4771]: I0227 01:09:15.797603 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z5hx5" podStartSLOduration=2.7872634979999997 podStartE2EDuration="44.797584058s" podCreationTimestamp="2026-02-27 01:08:31 +0000 UTC" firstStartedPulling="2026-02-27 01:08:32.914839099 +0000 UTC m=+225.852400387" lastFinishedPulling="2026-02-27 01:09:14.925159649 +0000 UTC m=+267.862720947" observedRunningTime="2026-02-27 01:09:15.576006144 +0000 UTC m=+268.513567432" watchObservedRunningTime="2026-02-27 01:09:15.797584058 +0000 UTC m=+268.735145356" Feb 27 01:09:16 crc kubenswrapper[4771]: I0227 01:09:16.506573 4771 generic.go:334] "Generic (PLEG): container finished" podID="0b5757cb-321d-4a76-8769-786b28a2b004" containerID="d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3" exitCode=0 Feb 27 01:09:16 crc kubenswrapper[4771]: I0227 01:09:16.506938 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lds4k" event={"ID":"0b5757cb-321d-4a76-8769-786b28a2b004","Type":"ContainerDied","Data":"d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3"} Feb 27 01:09:17 crc kubenswrapper[4771]: I0227 01:09:17.513408 4771 generic.go:334] "Generic (PLEG): container finished" podID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerID="95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992" exitCode=0 Feb 27 01:09:17 crc kubenswrapper[4771]: I0227 01:09:17.513483 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk58g" event={"ID":"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3","Type":"ContainerDied","Data":"95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992"} Feb 27 01:09:17 crc kubenswrapper[4771]: I0227 01:09:17.518256 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lds4k" event={"ID":"0b5757cb-321d-4a76-8769-786b28a2b004","Type":"ContainerStarted","Data":"0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d"} Feb 27 01:09:17 crc kubenswrapper[4771]: I0227 01:09:17.546735 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lds4k" podStartSLOduration=3.363747468 podStartE2EDuration="47.546719591s" podCreationTimestamp="2026-02-27 01:08:30 +0000 UTC" firstStartedPulling="2026-02-27 01:08:32.899804637 +0000 UTC m=+225.837365945" lastFinishedPulling="2026-02-27 01:09:17.08277678 +0000 UTC m=+270.020338068" observedRunningTime="2026-02-27 01:09:17.545408656 +0000 UTC m=+270.482969954" watchObservedRunningTime="2026-02-27 01:09:17.546719591 +0000 UTC m=+270.484280879" Feb 27 01:09:18 crc kubenswrapper[4771]: I0227 01:09:18.524701 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-g5gxk" event={"ID":"b4029ae4-2dfb-4351-88f8-08fefb8ab46e","Type":"ContainerStarted","Data":"2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1"} Feb 27 01:09:18 crc kubenswrapper[4771]: I0227 01:09:18.526909 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk58g" event={"ID":"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3","Type":"ContainerStarted","Data":"3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487"} Feb 27 01:09:19 crc kubenswrapper[4771]: I0227 01:09:19.532854 4771 generic.go:334] "Generic (PLEG): container finished" podID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerID="2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1" exitCode=0 Feb 27 01:09:19 crc kubenswrapper[4771]: I0227 01:09:19.532923 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5gxk" event={"ID":"b4029ae4-2dfb-4351-88f8-08fefb8ab46e","Type":"ContainerDied","Data":"2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1"} Feb 27 01:09:19 crc kubenswrapper[4771]: I0227 01:09:19.534714 4771 generic.go:334] "Generic (PLEG): container finished" podID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerID="8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02" exitCode=0 Feb 27 01:09:19 crc kubenswrapper[4771]: I0227 01:09:19.534795 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbc2d" event={"ID":"96a309b5-c10d-49d7-ade8-3c087250dd91","Type":"ContainerDied","Data":"8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02"} Feb 27 01:09:19 crc kubenswrapper[4771]: I0227 01:09:19.536790 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7r96b" event={"ID":"33d95d7b-dfe7-495a-b686-5737dd95b974","Type":"ContainerStarted","Data":"08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23"} Feb 27 01:09:19 crc kubenswrapper[4771]: E0227 01:09:19.550462 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d95d7b_dfe7_495a_b686_5737dd95b974.slice/crio-08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23.scope\": RecentStats: unable to find data in memory cache]" Feb 27 01:09:19 crc kubenswrapper[4771]: I0227 01:09:19.553397 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xk58g" podStartSLOduration=3.534599876 podStartE2EDuration="48.553375459s" podCreationTimestamp="2026-02-27 01:08:31 +0000 UTC" firstStartedPulling="2026-02-27 01:08:32.893848928 +0000 UTC m=+225.831410206" lastFinishedPulling="2026-02-27 01:09:17.912624501 +0000 UTC m=+270.850185789" observedRunningTime="2026-02-27 01:09:18.565321274 +0000 UTC m=+271.502882582" watchObservedRunningTime="2026-02-27 01:09:19.553375459 +0000 UTC m=+272.490936747" Feb 27 01:09:20 crc kubenswrapper[4771]: I0227 01:09:20.544973 4771 generic.go:334] "Generic (PLEG): container finished" podID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerID="08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23" exitCode=0 Feb 27 01:09:20 crc kubenswrapper[4771]: I0227 01:09:20.545170 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7r96b" 
event={"ID":"33d95d7b-dfe7-495a-b686-5737dd95b974","Type":"ContainerDied","Data":"08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23"} Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.320700 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lds4k" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.320776 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lds4k" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.523887 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xk58g" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.524283 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xk58g" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.540103 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lds4k" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.561184 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbc2d" event={"ID":"96a309b5-c10d-49d7-ade8-3c087250dd91","Type":"ContainerStarted","Data":"acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f"} Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.574677 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7r96b" event={"ID":"33d95d7b-dfe7-495a-b686-5737dd95b974","Type":"ContainerStarted","Data":"adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54"} Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.581881 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbc2d" podStartSLOduration=9.191452535 podStartE2EDuration="48.581859916s" podCreationTimestamp="2026-02-27 01:08:33 +0000 UTC" firstStartedPulling="2026-02-27 01:08:41.65035877 +0000 UTC m=+234.587920058" lastFinishedPulling="2026-02-27 01:09:21.040766161 +0000 UTC m=+273.978327439" observedRunningTime="2026-02-27 01:09:21.577339726 +0000 UTC m=+274.514901014" watchObservedRunningTime="2026-02-27 01:09:21.581859916 +0000 UTC m=+274.519421204" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.598973 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xk58g" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.608956 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7r96b" podStartSLOduration=8.230792749 podStartE2EDuration="47.608885243s" podCreationTimestamp="2026-02-27 01:08:34 +0000 UTC" firstStartedPulling="2026-02-27 01:08:41.650488454 +0000 UTC m=+234.588049742" lastFinishedPulling="2026-02-27 01:09:21.028580948 +0000 UTC m=+273.966142236" observedRunningTime="2026-02-27 01:09:21.603584362 +0000 UTC m=+274.541145650" watchObservedRunningTime="2026-02-27 01:09:21.608885243 +0000 UTC m=+274.546446541" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.693286 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z5hx5" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.693532 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-z5hx5" Feb 27 01:09:21 crc kubenswrapper[4771]: I0227 01:09:21.732755 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z5hx5" Feb 27 01:09:22 crc kubenswrapper[4771]: I0227 01:09:22.581178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5gxk" event={"ID":"b4029ae4-2dfb-4351-88f8-08fefb8ab46e","Type":"ContainerStarted","Data":"5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342"} Feb 27 01:09:22 crc kubenswrapper[4771]: I0227 01:09:22.583025 4771 generic.go:334] "Generic (PLEG): container finished" podID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerID="e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7" exitCode=0 Feb 27 01:09:22 crc kubenswrapper[4771]: I0227 01:09:22.583590 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qs5th" event={"ID":"65fbc634-d941-4dee-a758-3b0b10bd60f0","Type":"ContainerDied","Data":"e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7"} Feb 27 01:09:22 crc kubenswrapper[4771]: I0227 01:09:22.601413 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g5gxk" podStartSLOduration=3.254648592 podStartE2EDuration="51.601399996s" podCreationTimestamp="2026-02-27 01:08:31 +0000 UTC" firstStartedPulling="2026-02-27 01:08:33.02605909 +0000 UTC m=+225.963620378" lastFinishedPulling="2026-02-27 01:09:21.372810494 +0000 UTC m=+274.310371782" observedRunningTime="2026-02-27 01:09:22.599481134 +0000 UTC m=+275.537042422" watchObservedRunningTime="2026-02-27 01:09:22.601399996 +0000 UTC m=+275.538961284" Feb 27 01:09:22 crc kubenswrapper[4771]: I0227 01:09:22.630882 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lds4k" Feb 27 01:09:22 crc kubenswrapper[4771]: I0227 01:09:22.646970 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z5hx5" Feb 27 01:09:23 crc kubenswrapper[4771]: I0227 01:09:23.506501 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z5hx5"] Feb 27 01:09:23 crc kubenswrapper[4771]: I0227 01:09:23.907984 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:09:23 crc kubenswrapper[4771]: I0227 01:09:23.908252 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:09:23 crc kubenswrapper[4771]: I0227 01:09:23.952018 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:09:24 crc kubenswrapper[4771]: I0227 01:09:24.490952 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:09:24 crc kubenswrapper[4771]: I0227 01:09:24.490997 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:09:24 crc kubenswrapper[4771]: I0227 01:09:24.601393 4771 generic.go:334] "Generic (PLEG): container finished" podID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerID="53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3" exitCode=0 Feb 27 01:09:24 crc 
kubenswrapper[4771]: I0227 01:09:24.601492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5l2f" event={"ID":"deb9a4a5-1474-4744-a57e-fcdcc97922ed","Type":"ContainerDied","Data":"53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3"} Feb 27 01:09:24 crc kubenswrapper[4771]: I0227 01:09:24.602118 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z5hx5" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerName="registry-server" containerID="cri-o://152d0ec9817218b57a398bab4a74271426223cf7219a89961aefc31ce90a8199" gracePeriod=2 Feb 27 01:09:25 crc kubenswrapper[4771]: I0227 01:09:25.527015 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7r96b" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="registry-server" probeResult="failure" output=< Feb 27 01:09:25 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 27 01:09:25 crc kubenswrapper[4771]: > Feb 27 01:09:25 crc kubenswrapper[4771]: I0227 01:09:25.608123 4771 generic.go:334] "Generic (PLEG): container finished" podID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerID="152d0ec9817218b57a398bab4a74271426223cf7219a89961aefc31ce90a8199" exitCode=0 Feb 27 01:09:25 crc kubenswrapper[4771]: I0227 01:09:25.608159 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5hx5" event={"ID":"739ccbd2-56c7-4a26-ad40-4f0f908089e8","Type":"ContainerDied","Data":"152d0ec9817218b57a398bab4a74271426223cf7219a89961aefc31ce90a8199"} Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.614677 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5hx5" event={"ID":"739ccbd2-56c7-4a26-ad40-4f0f908089e8","Type":"ContainerDied","Data":"da90fe4302f6fc3636c974a254edf9cb034bb28bde68d1d4380a2d590be2b546"} Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.615005 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da90fe4302f6fc3636c974a254edf9cb034bb28bde68d1d4380a2d590be2b546" Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.616701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qs5th" event={"ID":"65fbc634-d941-4dee-a758-3b0b10bd60f0","Type":"ContainerStarted","Data":"f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4"} Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.634539 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z5hx5" Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.654724 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-catalog-content\") pod \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.654810 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b8tx\" (UniqueName: \"kubernetes.io/projected/739ccbd2-56c7-4a26-ad40-4f0f908089e8-kube-api-access-4b8tx\") pod \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.654835 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-utilities\") pod \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\" (UID: \"739ccbd2-56c7-4a26-ad40-4f0f908089e8\") " Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.656607 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-utilities" (OuterVolumeSpecName: "utilities") pod "739ccbd2-56c7-4a26-ad40-4f0f908089e8" (UID: "739ccbd2-56c7-4a26-ad40-4f0f908089e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.663816 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739ccbd2-56c7-4a26-ad40-4f0f908089e8-kube-api-access-4b8tx" (OuterVolumeSpecName: "kube-api-access-4b8tx") pod "739ccbd2-56c7-4a26-ad40-4f0f908089e8" (UID: "739ccbd2-56c7-4a26-ad40-4f0f908089e8"). InnerVolumeSpecName "kube-api-access-4b8tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.735489 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "739ccbd2-56c7-4a26-ad40-4f0f908089e8" (UID: "739ccbd2-56c7-4a26-ad40-4f0f908089e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.755704 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b8tx\" (UniqueName: \"kubernetes.io/projected/739ccbd2-56c7-4a26-ad40-4f0f908089e8-kube-api-access-4b8tx\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.755740 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:26 crc kubenswrapper[4771]: I0227 01:09:26.755751 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/739ccbd2-56c7-4a26-ad40-4f0f908089e8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:27 crc kubenswrapper[4771]: I0227 01:09:27.624438 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z5hx5" Feb 27 01:09:27 crc kubenswrapper[4771]: I0227 01:09:27.671916 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qs5th" podStartSLOduration=9.727492188 podStartE2EDuration="53.671897261s" podCreationTimestamp="2026-02-27 01:08:34 +0000 UTC" firstStartedPulling="2026-02-27 01:08:41.650369861 +0000 UTC m=+234.587931149" lastFinishedPulling="2026-02-27 01:09:25.594774934 +0000 UTC m=+278.532336222" observedRunningTime="2026-02-27 01:09:27.6681277 +0000 UTC m=+280.605688998" watchObservedRunningTime="2026-02-27 01:09:27.671897261 +0000 UTC m=+280.609458559" Feb 27 01:09:27 crc kubenswrapper[4771]: I0227 01:09:27.684012 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z5hx5"] Feb 27 01:09:27 crc kubenswrapper[4771]: I0227 01:09:27.686357 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z5hx5"] Feb 27 01:09:27 crc kubenswrapper[4771]: I0227 01:09:27.785802 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" path="/var/lib/kubelet/pods/739ccbd2-56c7-4a26-ad40-4f0f908089e8/volumes" Feb 27 01:09:28 crc kubenswrapper[4771]: I0227 01:09:28.599810 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74565c8b54-pzq4n"] Feb 27 01:09:28 crc kubenswrapper[4771]: I0227 01:09:28.600368 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" podUID="23a12bb6-171f-4e2b-9706-1b3ba0948752" containerName="controller-manager" containerID="cri-o://31f738954f9f29b777d10e12b36b1bbbfabfa0e060d3ebeb82f88afb4ad90205" gracePeriod=30 Feb 27 01:09:28 crc kubenswrapper[4771]: I0227 01:09:28.624418 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv"] Feb 27 01:09:28 crc kubenswrapper[4771]: I0227 01:09:28.624743 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" podUID="f000cf8a-5b9a-472c-b986-881daa2151e1" containerName="route-controller-manager" containerID="cri-o://6a7e8563774609c883331d2fe0b74fb7e738953d3b2c873c8c845205a013fd9c" gracePeriod=30 Feb 27 01:09:28 crc kubenswrapper[4771]: I0227 01:09:28.952860 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:09:28 crc kubenswrapper[4771]: I0227 01:09:28.953048 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:09:29 crc kubenswrapper[4771]: I0227 01:09:29.635266 4771 generic.go:334] "Generic (PLEG): container finished" podID="23a12bb6-171f-4e2b-9706-1b3ba0948752" containerID="31f738954f9f29b777d10e12b36b1bbbfabfa0e060d3ebeb82f88afb4ad90205" exitCode=0 Feb 27 01:09:29 crc kubenswrapper[4771]: I0227 
01:09:29.635337 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" event={"ID":"23a12bb6-171f-4e2b-9706-1b3ba0948752","Type":"ContainerDied","Data":"31f738954f9f29b777d10e12b36b1bbbfabfa0e060d3ebeb82f88afb4ad90205"} Feb 27 01:09:29 crc kubenswrapper[4771]: I0227 01:09:29.639030 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5l2f" event={"ID":"deb9a4a5-1474-4744-a57e-fcdcc97922ed","Type":"ContainerStarted","Data":"8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d"} Feb 27 01:09:29 crc kubenswrapper[4771]: I0227 01:09:29.640406 4771 generic.go:334] "Generic (PLEG): container finished" podID="f000cf8a-5b9a-472c-b986-881daa2151e1" containerID="6a7e8563774609c883331d2fe0b74fb7e738953d3b2c873c8c845205a013fd9c" exitCode=0 Feb 27 01:09:29 crc kubenswrapper[4771]: I0227 01:09:29.640436 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" event={"ID":"f000cf8a-5b9a-472c-b986-881daa2151e1","Type":"ContainerDied","Data":"6a7e8563774609c883331d2fe0b74fb7e738953d3b2c873c8c845205a013fd9c"} Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.064740 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.093352 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps"] Feb 27 01:09:30 crc kubenswrapper[4771]: E0227 01:09:30.093610 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerName="extract-content" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.093628 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerName="extract-content" Feb 27 01:09:30 crc kubenswrapper[4771]: E0227 01:09:30.093644 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerName="registry-server" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.093651 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerName="registry-server" Feb 27 01:09:30 crc kubenswrapper[4771]: E0227 01:09:30.093664 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f000cf8a-5b9a-472c-b986-881daa2151e1" containerName="route-controller-manager" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.093672 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f000cf8a-5b9a-472c-b986-881daa2151e1" containerName="route-controller-manager" Feb 27 01:09:30 crc kubenswrapper[4771]: E0227 01:09:30.093681 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerName="extract-utilities" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.093689 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerName="extract-utilities" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.093794 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f000cf8a-5b9a-472c-b986-881daa2151e1" containerName="route-controller-manager" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.093810 4771 
memory_manager.go:354] "RemoveStaleState removing state" podUID="739ccbd2-56c7-4a26-ad40-4f0f908089e8" containerName="registry-server" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.094197 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.106277 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps"] Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.183365 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.194979 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-config\") pod \"f000cf8a-5b9a-472c-b986-881daa2151e1\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.195068 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f62tz\" (UniqueName: \"kubernetes.io/projected/f000cf8a-5b9a-472c-b986-881daa2151e1-kube-api-access-f62tz\") pod \"f000cf8a-5b9a-472c-b986-881daa2151e1\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.195132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-client-ca\") pod \"f000cf8a-5b9a-472c-b986-881daa2151e1\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.195167 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f000cf8a-5b9a-472c-b986-881daa2151e1-serving-cert\") pod \"f000cf8a-5b9a-472c-b986-881daa2151e1\" (UID: \"f000cf8a-5b9a-472c-b986-881daa2151e1\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.195329 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xmr\" (UniqueName: \"kubernetes.io/projected/63faec48-e964-4842-b418-b5f0fc000f37-kube-api-access-v6xmr\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.195376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-client-ca\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.195397 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-config\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 
crc kubenswrapper[4771]: I0227 01:09:30.195417 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63faec48-e964-4842-b418-b5f0fc000f37-serving-cert\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.195932 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-client-ca" (OuterVolumeSpecName: "client-ca") pod "f000cf8a-5b9a-472c-b986-881daa2151e1" (UID: "f000cf8a-5b9a-472c-b986-881daa2151e1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.196061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-config" (OuterVolumeSpecName: "config") pod "f000cf8a-5b9a-472c-b986-881daa2151e1" (UID: "f000cf8a-5b9a-472c-b986-881daa2151e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.200417 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f000cf8a-5b9a-472c-b986-881daa2151e1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f000cf8a-5b9a-472c-b986-881daa2151e1" (UID: "f000cf8a-5b9a-472c-b986-881daa2151e1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.203712 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f000cf8a-5b9a-472c-b986-881daa2151e1-kube-api-access-f62tz" (OuterVolumeSpecName: "kube-api-access-f62tz") pod "f000cf8a-5b9a-472c-b986-881daa2151e1" (UID: "f000cf8a-5b9a-472c-b986-881daa2151e1"). InnerVolumeSpecName "kube-api-access-f62tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296594 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a12bb6-171f-4e2b-9706-1b3ba0948752-serving-cert\") pod \"23a12bb6-171f-4e2b-9706-1b3ba0948752\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296667 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnmh\" (UniqueName: \"kubernetes.io/projected/23a12bb6-171f-4e2b-9706-1b3ba0948752-kube-api-access-nhnmh\") pod \"23a12bb6-171f-4e2b-9706-1b3ba0948752\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296698 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-client-ca\") pod \"23a12bb6-171f-4e2b-9706-1b3ba0948752\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296720 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-proxy-ca-bundles\") pod \"23a12bb6-171f-4e2b-9706-1b3ba0948752\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296785 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-config\") pod \"23a12bb6-171f-4e2b-9706-1b3ba0948752\" (UID: \"23a12bb6-171f-4e2b-9706-1b3ba0948752\") " Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296906 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6xmr\" (UniqueName: \"kubernetes.io/projected/63faec48-e964-4842-b418-b5f0fc000f37-kube-api-access-v6xmr\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296946 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-client-ca\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-config\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.296993 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63faec48-e964-4842-b418-b5f0fc000f37-serving-cert\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" 
Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.297051 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f62tz\" (UniqueName: \"kubernetes.io/projected/f000cf8a-5b9a-472c-b986-881daa2151e1-kube-api-access-f62tz\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.297065 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.297077 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f000cf8a-5b9a-472c-b986-881daa2151e1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.297088 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f000cf8a-5b9a-472c-b986-881daa2151e1-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.298462 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-client-ca" (OuterVolumeSpecName: "client-ca") pod "23a12bb6-171f-4e2b-9706-1b3ba0948752" (UID: "23a12bb6-171f-4e2b-9706-1b3ba0948752"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.298485 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "23a12bb6-171f-4e2b-9706-1b3ba0948752" (UID: "23a12bb6-171f-4e2b-9706-1b3ba0948752"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.298479 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-config" (OuterVolumeSpecName: "config") pod "23a12bb6-171f-4e2b-9706-1b3ba0948752" (UID: "23a12bb6-171f-4e2b-9706-1b3ba0948752"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.298885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-client-ca\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.299121 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-config\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.300140 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a12bb6-171f-4e2b-9706-1b3ba0948752-kube-api-access-nhnmh" (OuterVolumeSpecName: "kube-api-access-nhnmh") pod "23a12bb6-171f-4e2b-9706-1b3ba0948752" (UID: "23a12bb6-171f-4e2b-9706-1b3ba0948752"). 
InnerVolumeSpecName "kube-api-access-nhnmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.301030 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63faec48-e964-4842-b418-b5f0fc000f37-serving-cert\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.301092 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a12bb6-171f-4e2b-9706-1b3ba0948752-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "23a12bb6-171f-4e2b-9706-1b3ba0948752" (UID: "23a12bb6-171f-4e2b-9706-1b3ba0948752"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.316224 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6xmr\" (UniqueName: \"kubernetes.io/projected/63faec48-e964-4842-b418-b5f0fc000f37-kube-api-access-v6xmr\") pod \"route-controller-manager-69784b8f5d-c8rps\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.399588 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.399636 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.399652 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a12bb6-171f-4e2b-9706-1b3ba0948752-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.399664 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23a12bb6-171f-4e2b-9706-1b3ba0948752-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.399677 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhnmh\" (UniqueName: \"kubernetes.io/projected/23a12bb6-171f-4e2b-9706-1b3ba0948752-kube-api-access-nhnmh\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.409582 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.647188 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" event={"ID":"f000cf8a-5b9a-472c-b986-881daa2151e1","Type":"ContainerDied","Data":"45d0b174205ff1ea7f95ebe53a08e6edff03851e7f5ee4069a8f93b4e0dc2ea9"} Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.647417 4771 scope.go:117] "RemoveContainer" containerID="6a7e8563774609c883331d2fe0b74fb7e738953d3b2c873c8c845205a013fd9c" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.647263 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.648516 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" event={"ID":"23a12bb6-171f-4e2b-9706-1b3ba0948752","Type":"ContainerDied","Data":"c808d905c2a943fcda916ad222a2d5dedb686587c6484600f094a8f7699eb5e8"} Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.648589 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74565c8b54-pzq4n" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.668394 4771 scope.go:117] "RemoveContainer" containerID="31f738954f9f29b777d10e12b36b1bbbfabfa0e060d3ebeb82f88afb4ad90205" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.686197 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d5l2f" podStartSLOduration=2.938191784 podStartE2EDuration="57.686172633s" podCreationTimestamp="2026-02-27 01:08:33 +0000 UTC" firstStartedPulling="2026-02-27 01:08:34.056144829 +0000 UTC m=+226.993706117" lastFinishedPulling="2026-02-27 01:09:28.804125688 +0000 UTC m=+281.741686966" observedRunningTime="2026-02-27 01:09:30.684731745 +0000 UTC m=+283.622293033" watchObservedRunningTime="2026-02-27 01:09:30.686172633 +0000 UTC m=+283.623733961" Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.701592 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv"] Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.704661 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75c84b495d-b5fpv"] Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.707682 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74565c8b54-pzq4n"] Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.708114 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74565c8b54-pzq4n"] Feb 27 01:09:30 crc kubenswrapper[4771]: I0227 01:09:30.837971 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps"] Feb 27 01:09:30 crc kubenswrapper[4771]: W0227 01:09:30.849596 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63faec48_e964_4842_b418_b5f0fc000f37.slice/crio-3e23a7b568d5a49c2385887cd6fd323b132aa02503eb19b17971370d064f67fe WatchSource:0}: Error finding 
container 3e23a7b568d5a49c2385887cd6fd323b132aa02503eb19b17971370d064f67fe: Status 404 returned error can't find the container with id 3e23a7b568d5a49c2385887cd6fd323b132aa02503eb19b17971370d064f67fe Feb 27 01:09:31 crc kubenswrapper[4771]: I0227 01:09:31.632206 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xk58g" Feb 27 01:09:31 crc kubenswrapper[4771]: I0227 01:09:31.663403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" event={"ID":"63faec48-e964-4842-b418-b5f0fc000f37","Type":"ContainerStarted","Data":"cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573"} Feb 27 01:09:31 crc kubenswrapper[4771]: I0227 01:09:31.663459 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" event={"ID":"63faec48-e964-4842-b418-b5f0fc000f37","Type":"ContainerStarted","Data":"3e23a7b568d5a49c2385887cd6fd323b132aa02503eb19b17971370d064f67fe"} Feb 27 01:09:31 crc kubenswrapper[4771]: I0227 01:09:31.781625 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a12bb6-171f-4e2b-9706-1b3ba0948752" path="/var/lib/kubelet/pods/23a12bb6-171f-4e2b-9706-1b3ba0948752/volumes" Feb 27 01:09:31 crc kubenswrapper[4771]: I0227 01:09:31.782760 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f000cf8a-5b9a-472c-b986-881daa2151e1" path="/var/lib/kubelet/pods/f000cf8a-5b9a-472c-b986-881daa2151e1/volumes" Feb 27 01:09:31 crc kubenswrapper[4771]: I0227 01:09:31.894128 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:09:31 crc kubenswrapper[4771]: I0227 01:09:31.894180 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:09:31 crc kubenswrapper[4771]: I0227 01:09:31.940654 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.623588 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69c78d7f85-zj485"] Feb 27 01:09:32 crc kubenswrapper[4771]: E0227 01:09:32.624204 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a12bb6-171f-4e2b-9706-1b3ba0948752" containerName="controller-manager" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.624223 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a12bb6-171f-4e2b-9706-1b3ba0948752" containerName="controller-manager" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.624421 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a12bb6-171f-4e2b-9706-1b3ba0948752" containerName="controller-manager" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.625003 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.627477 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.631169 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.631183 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.631203 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.631236 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.631401 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.633449 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69c78d7f85-zj485"] Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.641027 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.687740 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" podStartSLOduration=4.687717956 podStartE2EDuration="4.687717956s" podCreationTimestamp="2026-02-27 01:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:32.684932402 +0000 UTC m=+285.622493700" watchObservedRunningTime="2026-02-27 01:09:32.687717956 +0000 UTC m=+285.625279254" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.709824 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.725820 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-proxy-ca-bundles\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.725870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beb27eba-ff4c-41aa-b322-6ec456945455-serving-cert\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.725950 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-config\") pod 
\"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.725989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzgdz\" (UniqueName: \"kubernetes.io/projected/beb27eba-ff4c-41aa-b322-6ec456945455-kube-api-access-pzgdz\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.726024 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-client-ca\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.827733 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-config\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.827782 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzgdz\" (UniqueName: \"kubernetes.io/projected/beb27eba-ff4c-41aa-b322-6ec456945455-kube-api-access-pzgdz\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.828459 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-client-ca\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.828538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-proxy-ca-bundles\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.828584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beb27eba-ff4c-41aa-b322-6ec456945455-serving-cert\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.829374 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-client-ca\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 
01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.829464 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-config\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.829569 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-proxy-ca-bundles\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.836044 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beb27eba-ff4c-41aa-b322-6ec456945455-serving-cert\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.852246 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzgdz\" (UniqueName: \"kubernetes.io/projected/beb27eba-ff4c-41aa-b322-6ec456945455-kube-api-access-pzgdz\") pod \"controller-manager-69c78d7f85-zj485\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:32 crc kubenswrapper[4771]: I0227 01:09:32.942957 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:33 crc kubenswrapper[4771]: I0227 01:09:33.370689 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69c78d7f85-zj485"] Feb 27 01:09:33 crc kubenswrapper[4771]: I0227 01:09:33.491755 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:09:33 crc kubenswrapper[4771]: I0227 01:09:33.492527 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:09:33 crc kubenswrapper[4771]: I0227 01:09:33.569330 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:09:33 crc kubenswrapper[4771]: I0227 01:09:33.676771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" event={"ID":"beb27eba-ff4c-41aa-b322-6ec456945455","Type":"ContainerStarted","Data":"adb4112a33756f4e2880fb5800120cd0b5a3dd54ad84611c7e7756b428bf4397"} Feb 27 01:09:33 crc kubenswrapper[4771]: I0227 01:09:33.976132 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.109794 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g5gxk"] Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.554687 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 
01:09:34.629994 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.685844 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g5gxk" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerName="registry-server" containerID="cri-o://5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342" gracePeriod=2 Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.685950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" event={"ID":"beb27eba-ff4c-41aa-b322-6ec456945455","Type":"ContainerStarted","Data":"517c64ffd803e603e4a5298bb35650e43ca72ece77c19e5f1e95c893dfe82ef5"} Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.686524 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.686666 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.687773 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.692887 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.710525 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" podStartSLOduration=6.710499403 podStartE2EDuration="6.710499403s" podCreationTimestamp="2026-02-27 01:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:34.70432516 +0000 UTC m=+287.641886438" watchObservedRunningTime="2026-02-27 01:09:34.710499403 +0000 UTC m=+287.648060731" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.737233 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:09:34 crc kubenswrapper[4771]: I0227 01:09:34.754372 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.093462 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.265861 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67wv4\" (UniqueName: \"kubernetes.io/projected/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-kube-api-access-67wv4\") pod \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.265951 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-catalog-content\") pod \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.266021 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-utilities\") pod \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\" (UID: \"b4029ae4-2dfb-4351-88f8-08fefb8ab46e\") " Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.267302 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-utilities" (OuterVolumeSpecName: "utilities") pod "b4029ae4-2dfb-4351-88f8-08fefb8ab46e" (UID: "b4029ae4-2dfb-4351-88f8-08fefb8ab46e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.275399 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-kube-api-access-67wv4" (OuterVolumeSpecName: "kube-api-access-67wv4") pod "b4029ae4-2dfb-4351-88f8-08fefb8ab46e" (UID: "b4029ae4-2dfb-4351-88f8-08fefb8ab46e"). InnerVolumeSpecName "kube-api-access-67wv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.351462 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4029ae4-2dfb-4351-88f8-08fefb8ab46e" (UID: "b4029ae4-2dfb-4351-88f8-08fefb8ab46e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.367431 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67wv4\" (UniqueName: \"kubernetes.io/projected/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-kube-api-access-67wv4\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.367463 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.367472 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4029ae4-2dfb-4351-88f8-08fefb8ab46e-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.701209 4771 generic.go:334] "Generic (PLEG): container finished" podID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerID="5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342" exitCode=0 Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.701337 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g5gxk" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.702138 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5gxk" event={"ID":"b4029ae4-2dfb-4351-88f8-08fefb8ab46e","Type":"ContainerDied","Data":"5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342"} Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.702178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5gxk" event={"ID":"b4029ae4-2dfb-4351-88f8-08fefb8ab46e","Type":"ContainerDied","Data":"4737c8964bff6f541285d04beba79fb67ec0e8b294c7f0fd17c2e5758f58edf6"} Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.702206 4771 scope.go:117] "RemoveContainer" containerID="5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.731586 4771 scope.go:117] "RemoveContainer" containerID="2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.734402 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g5gxk"] Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.741173 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g5gxk"] Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.758660 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.761540 4771 scope.go:117] "RemoveContainer" containerID="95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.792568 4771 scope.go:117] "RemoveContainer" containerID="5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.793520 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" path="/var/lib/kubelet/pods/b4029ae4-2dfb-4351-88f8-08fefb8ab46e/volumes" Feb 27 01:09:35 crc kubenswrapper[4771]: E0227 01:09:35.793857 4771 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342\": container with ID starting with 5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342 not found: ID does not exist" containerID="5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.793902 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342"} err="failed to get container status \"5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342\": rpc error: code = NotFound desc = could not find container \"5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342\": container with ID starting with 5b329ce99e94ed0551bbd390d29da6b08d178907a0dc5ee073ff1a0389e79342 not found: ID does not exist" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.793926 4771 scope.go:117] "RemoveContainer" containerID="2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1" Feb 27 01:09:35 crc kubenswrapper[4771]: E0227 01:09:35.794399 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1\": container with ID starting with 2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1 not found: ID does not exist" containerID="2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.794435 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1"} err="failed to get container status \"2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1\": rpc error: code = NotFound desc = could not find container \"2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1\": container with ID starting with 2ae0e2ff149c5134630b388b3ae0566c67b5fb7c4e1b9edbece6a6b618fe72b1 not found: ID does not exist" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.794463 4771 scope.go:117] "RemoveContainer" containerID="95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e" Feb 27 01:09:35 crc kubenswrapper[4771]: E0227 01:09:35.795676 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e\": container with ID starting with 95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e not found: ID does not exist" containerID="95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.795729 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e"} err="failed to get container status \"95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e\": rpc error: code = NotFound desc = could not find container \"95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e\": container with ID starting with 95a2b65788e7ed8da3746791683c1d62b52242d950088997901356b83d480a1e not found: ID does not exist" Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.906193 4771 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbc2d"] Feb 27 01:09:35 crc kubenswrapper[4771]: I0227 01:09:35.906457 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbc2d" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerName="registry-server" containerID="cri-o://acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f" gracePeriod=2 Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.344632 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.482278 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-catalog-content\") pod \"96a309b5-c10d-49d7-ade8-3c087250dd91\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.482359 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxnk7\" (UniqueName: \"kubernetes.io/projected/96a309b5-c10d-49d7-ade8-3c087250dd91-kube-api-access-vxnk7\") pod \"96a309b5-c10d-49d7-ade8-3c087250dd91\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.482447 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-utilities\") pod \"96a309b5-c10d-49d7-ade8-3c087250dd91\" (UID: \"96a309b5-c10d-49d7-ade8-3c087250dd91\") " Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.483279 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-utilities" (OuterVolumeSpecName: "utilities") pod "96a309b5-c10d-49d7-ade8-3c087250dd91" (UID: "96a309b5-c10d-49d7-ade8-3c087250dd91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.494790 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a309b5-c10d-49d7-ade8-3c087250dd91-kube-api-access-vxnk7" (OuterVolumeSpecName: "kube-api-access-vxnk7") pod "96a309b5-c10d-49d7-ade8-3c087250dd91" (UID: "96a309b5-c10d-49d7-ade8-3c087250dd91"). InnerVolumeSpecName "kube-api-access-vxnk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.518116 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96a309b5-c10d-49d7-ade8-3c087250dd91" (UID: "96a309b5-c10d-49d7-ade8-3c087250dd91"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.583562 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.583595 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a309b5-c10d-49d7-ade8-3c087250dd91-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.583608 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxnk7\" (UniqueName: \"kubernetes.io/projected/96a309b5-c10d-49d7-ade8-3c087250dd91-kube-api-access-vxnk7\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.708144 4771 generic.go:334] "Generic (PLEG): container finished" podID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerID="acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f" exitCode=0 Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.708201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbc2d" event={"ID":"96a309b5-c10d-49d7-ade8-3c087250dd91","Type":"ContainerDied","Data":"acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f"} Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.708262 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbc2d" event={"ID":"96a309b5-c10d-49d7-ade8-3c087250dd91","Type":"ContainerDied","Data":"e45be0b3465e90c9cd58b1d66c3a3466721bda1643a8fce19892514006807477"} Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.708282 4771 scope.go:117] "RemoveContainer" containerID="acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.708280 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbc2d" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.724885 4771 scope.go:117] "RemoveContainer" containerID="8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.740506 4771 scope.go:117] "RemoveContainer" containerID="f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.747582 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbc2d"] Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.755030 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbc2d"] Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.775015 4771 scope.go:117] "RemoveContainer" containerID="acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f" Feb 27 01:09:36 crc kubenswrapper[4771]: E0227 01:09:36.775412 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f\": container with ID starting with acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f not found: ID does not exist" containerID="acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.775442 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f"} err="failed to get container status \"acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f\": rpc error: code = NotFound desc = could not find container \"acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f\": container with ID starting with acf8f48a6bca2df434649400318e72c81e1e17c5ad4a9cb69acb1b15b603a50f not found: ID does not exist" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.775465 4771 scope.go:117] "RemoveContainer" containerID="8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02" Feb 27 01:09:36 crc kubenswrapper[4771]: E0227 01:09:36.775702 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02\": container with ID starting with 8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02 not found: ID does not exist" containerID="8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.775733 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02"} err="failed to get container status \"8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02\": rpc error: code = NotFound desc = could not find container \"8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02\": container with ID starting with 8ab16e9fcbc755af328bf7106826bde61bd5239d22b932d70a518fb4e5387e02 not found: ID does not exist" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.775749 4771 scope.go:117] "RemoveContainer" containerID="f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a" Feb 27 01:09:36 crc kubenswrapper[4771]: E0227 01:09:36.775968 4771 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a\": container with ID starting with f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a not found: ID does not exist" containerID="f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a" Feb 27 01:09:36 crc kubenswrapper[4771]: I0227 01:09:36.775991 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a"} err="failed to get container status \"f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a\": rpc error: code = NotFound desc = could not find container \"f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a\": container with ID starting with f3545511429d990989fa6ddab489786b6685316f32258dadab235bd234b17a9a not found: ID does not exist" Feb 27 01:09:37 crc kubenswrapper[4771]: I0227 01:09:37.782191 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" path="/var/lib/kubelet/pods/96a309b5-c10d-49d7-ade8-3c087250dd91/volumes" Feb 27 01:09:38 crc kubenswrapper[4771]: I0227 01:09:38.310982 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qs5th"] Feb 27 01:09:38 crc kubenswrapper[4771]: I0227 01:09:38.725429 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qs5th" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerName="registry-server" containerID="cri-o://f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4" gracePeriod=2 Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.184002 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.327741 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-utilities\") pod \"65fbc634-d941-4dee-a758-3b0b10bd60f0\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.327844 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-catalog-content\") pod \"65fbc634-d941-4dee-a758-3b0b10bd60f0\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.327996 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2x9v\" (UniqueName: \"kubernetes.io/projected/65fbc634-d941-4dee-a758-3b0b10bd60f0-kube-api-access-f2x9v\") pod \"65fbc634-d941-4dee-a758-3b0b10bd60f0\" (UID: \"65fbc634-d941-4dee-a758-3b0b10bd60f0\") " Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.329194 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-utilities" (OuterVolumeSpecName: "utilities") pod "65fbc634-d941-4dee-a758-3b0b10bd60f0" (UID: "65fbc634-d941-4dee-a758-3b0b10bd60f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.332912 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fbc634-d941-4dee-a758-3b0b10bd60f0-kube-api-access-f2x9v" (OuterVolumeSpecName: "kube-api-access-f2x9v") pod "65fbc634-d941-4dee-a758-3b0b10bd60f0" (UID: "65fbc634-d941-4dee-a758-3b0b10bd60f0"). InnerVolumeSpecName "kube-api-access-f2x9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.429179 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.429216 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2x9v\" (UniqueName: \"kubernetes.io/projected/65fbc634-d941-4dee-a758-3b0b10bd60f0-kube-api-access-f2x9v\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.494858 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65fbc634-d941-4dee-a758-3b0b10bd60f0" (UID: "65fbc634-d941-4dee-a758-3b0b10bd60f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.530338 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fbc634-d941-4dee-a758-3b0b10bd60f0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.733457 4771 generic.go:334] "Generic (PLEG): container finished" podID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerID="f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4" exitCode=0 Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.733500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qs5th" event={"ID":"65fbc634-d941-4dee-a758-3b0b10bd60f0","Type":"ContainerDied","Data":"f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4"} Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.733529 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qs5th" event={"ID":"65fbc634-d941-4dee-a758-3b0b10bd60f0","Type":"ContainerDied","Data":"52474d2ba3c64320970a53825bb51ad389f4effbb229761f4021772b3454f8d8"} Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.733583 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qs5th" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.733604 4771 scope.go:117] "RemoveContainer" containerID="f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.752393 4771 scope.go:117] "RemoveContainer" containerID="e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.772851 4771 scope.go:117] "RemoveContainer" containerID="559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.789428 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qs5th"] Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.789483 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qs5th"] Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.799990 4771 scope.go:117] "RemoveContainer" containerID="f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4" Feb 27 01:09:39 crc kubenswrapper[4771]: E0227 01:09:39.800417 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4\": container with ID starting with f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4 not found: ID does not exist" containerID="f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.800476 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4"} err="failed to get container status \"f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4\": rpc error: code = NotFound desc = could not find container \"f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4\": container with ID starting with f38d18dbf0fd05c4818a47695fdb839783a19422fc076dfa197c9fcce4ac8bf4 not found: ID does not exist" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.800503 4771 scope.go:117] "RemoveContainer" containerID="e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7" Feb 27 01:09:39 crc kubenswrapper[4771]: E0227 01:09:39.801085 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7\": container with ID starting with e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7 not found: ID does not exist" containerID="e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.801149 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7"} err="failed to get container status \"e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7\": rpc error: code = NotFound desc = could not find container \"e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7\": container with ID starting with e905fed9b166a800e271dfb93c69fd17a7c60860a15a7fae25400bd9f145b1a7 not found: ID does not exist" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.801188 4771 scope.go:117] "RemoveContainer" 
containerID="559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1" Feb 27 01:09:39 crc kubenswrapper[4771]: E0227 01:09:39.801519 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1\": container with ID starting with 559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1 not found: ID does not exist" containerID="559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1" Feb 27 01:09:39 crc kubenswrapper[4771]: I0227 01:09:39.801559 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1"} err="failed to get container status \"559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1\": rpc error: code = NotFound desc = could not find container \"559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1\": container with ID starting with 559a793c8bc90ff6ec92096c31966d30de7b5630ba93dd06b4863076c13ad3f1 not found: ID does not exist" Feb 27 01:09:40 crc kubenswrapper[4771]: I0227 01:09:40.409887 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:40 crc kubenswrapper[4771]: I0227 01:09:40.416382 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:41 crc kubenswrapper[4771]: I0227 01:09:41.784110 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" path="/var/lib/kubelet/pods/65fbc634-d941-4dee-a758-3b0b10bd60f0/volumes" Feb 27 01:09:42 crc kubenswrapper[4771]: I0227 01:09:42.606667 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9scwl"] Feb 27 01:09:48 crc kubenswrapper[4771]: I0227 01:09:48.618256 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69c78d7f85-zj485"] Feb 27 01:09:48 crc kubenswrapper[4771]: I0227 01:09:48.619176 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" podUID="beb27eba-ff4c-41aa-b322-6ec456945455" containerName="controller-manager" containerID="cri-o://517c64ffd803e603e4a5298bb35650e43ca72ece77c19e5f1e95c893dfe82ef5" gracePeriod=30 Feb 27 01:09:48 crc kubenswrapper[4771]: I0227 01:09:48.716374 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps"] Feb 27 01:09:48 crc kubenswrapper[4771]: I0227 01:09:48.716762 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" podUID="63faec48-e964-4842-b418-b5f0fc000f37" containerName="route-controller-manager" containerID="cri-o://cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573" gracePeriod=30 Feb 27 01:09:48 crc kubenswrapper[4771]: I0227 01:09:48.790253 4771 generic.go:334] "Generic (PLEG): container finished" podID="beb27eba-ff4c-41aa-b322-6ec456945455" containerID="517c64ffd803e603e4a5298bb35650e43ca72ece77c19e5f1e95c893dfe82ef5" exitCode=0 Feb 27 01:09:48 crc kubenswrapper[4771]: I0227 01:09:48.790355 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" event={"ID":"beb27eba-ff4c-41aa-b322-6ec456945455","Type":"ContainerDied","Data":"517c64ffd803e603e4a5298bb35650e43ca72ece77c19e5f1e95c893dfe82ef5"} Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.267534 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.270254 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.377831 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-config\") pod \"63faec48-e964-4842-b418-b5f0fc000f37\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.377916 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-client-ca\") pod \"63faec48-e964-4842-b418-b5f0fc000f37\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.377955 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzgdz\" (UniqueName: \"kubernetes.io/projected/beb27eba-ff4c-41aa-b322-6ec456945455-kube-api-access-pzgdz\") pod \"beb27eba-ff4c-41aa-b322-6ec456945455\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.377989 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-config\") pod \"beb27eba-ff4c-41aa-b322-6ec456945455\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.378024 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-proxy-ca-bundles\") pod \"beb27eba-ff4c-41aa-b322-6ec456945455\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.378055 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-client-ca\") pod \"beb27eba-ff4c-41aa-b322-6ec456945455\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.378080 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63faec48-e964-4842-b418-b5f0fc000f37-serving-cert\") pod \"63faec48-e964-4842-b418-b5f0fc000f37\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.378100 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beb27eba-ff4c-41aa-b322-6ec456945455-serving-cert\") pod \"beb27eba-ff4c-41aa-b322-6ec456945455\" (UID: \"beb27eba-ff4c-41aa-b322-6ec456945455\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.378120 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6xmr\" (UniqueName: \"kubernetes.io/projected/63faec48-e964-4842-b418-b5f0fc000f37-kube-api-access-v6xmr\") pod \"63faec48-e964-4842-b418-b5f0fc000f37\" (UID: \"63faec48-e964-4842-b418-b5f0fc000f37\") " Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.382533 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "beb27eba-ff4c-41aa-b322-6ec456945455" (UID: "beb27eba-ff4c-41aa-b322-6ec456945455"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.382847 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-client-ca" (OuterVolumeSpecName: "client-ca") pod "63faec48-e964-4842-b418-b5f0fc000f37" (UID: "63faec48-e964-4842-b418-b5f0fc000f37"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.383327 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-config" (OuterVolumeSpecName: "config") pod "63faec48-e964-4842-b418-b5f0fc000f37" (UID: "63faec48-e964-4842-b418-b5f0fc000f37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.383839 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-config" (OuterVolumeSpecName: "config") pod "beb27eba-ff4c-41aa-b322-6ec456945455" (UID: "beb27eba-ff4c-41aa-b322-6ec456945455"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.384118 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-client-ca" (OuterVolumeSpecName: "client-ca") pod "beb27eba-ff4c-41aa-b322-6ec456945455" (UID: "beb27eba-ff4c-41aa-b322-6ec456945455"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.389602 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb27eba-ff4c-41aa-b322-6ec456945455-kube-api-access-pzgdz" (OuterVolumeSpecName: "kube-api-access-pzgdz") pod "beb27eba-ff4c-41aa-b322-6ec456945455" (UID: "beb27eba-ff4c-41aa-b322-6ec456945455"). InnerVolumeSpecName "kube-api-access-pzgdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.389740 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63faec48-e964-4842-b418-b5f0fc000f37-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63faec48-e964-4842-b418-b5f0fc000f37" (UID: "63faec48-e964-4842-b418-b5f0fc000f37"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.389923 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63faec48-e964-4842-b418-b5f0fc000f37-kube-api-access-v6xmr" (OuterVolumeSpecName: "kube-api-access-v6xmr") pod "63faec48-e964-4842-b418-b5f0fc000f37" (UID: "63faec48-e964-4842-b418-b5f0fc000f37"). InnerVolumeSpecName "kube-api-access-v6xmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.395929 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb27eba-ff4c-41aa-b322-6ec456945455-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "beb27eba-ff4c-41aa-b322-6ec456945455" (UID: "beb27eba-ff4c-41aa-b322-6ec456945455"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482368 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482420 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzgdz\" (UniqueName: \"kubernetes.io/projected/beb27eba-ff4c-41aa-b322-6ec456945455-kube-api-access-pzgdz\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482437 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482449 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482461 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beb27eba-ff4c-41aa-b322-6ec456945455-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482472 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63faec48-e964-4842-b418-b5f0fc000f37-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482483 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beb27eba-ff4c-41aa-b322-6ec456945455-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482517 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6xmr\" (UniqueName: \"kubernetes.io/projected/63faec48-e964-4842-b418-b5f0fc000f37-kube-api-access-v6xmr\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.482528 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63faec48-e964-4842-b418-b5f0fc000f37-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.798200 4771 generic.go:334] "Generic (PLEG): container finished" podID="63faec48-e964-4842-b418-b5f0fc000f37" 
containerID="cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573" exitCode=0 Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.798251 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" event={"ID":"63faec48-e964-4842-b418-b5f0fc000f37","Type":"ContainerDied","Data":"cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573"} Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.798315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" event={"ID":"63faec48-e964-4842-b418-b5f0fc000f37","Type":"ContainerDied","Data":"3e23a7b568d5a49c2385887cd6fd323b132aa02503eb19b17971370d064f67fe"} Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.798344 4771 scope.go:117] "RemoveContainer" containerID="cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.798394 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.800421 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" event={"ID":"beb27eba-ff4c-41aa-b322-6ec456945455","Type":"ContainerDied","Data":"adb4112a33756f4e2880fb5800120cd0b5a3dd54ad84611c7e7756b428bf4397"} Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.800508 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69c78d7f85-zj485" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.824820 4771 scope.go:117] "RemoveContainer" containerID="cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573" Feb 27 01:09:49 crc kubenswrapper[4771]: E0227 01:09:49.825359 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573\": container with ID starting with cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573 not found: ID does not exist" containerID="cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.825405 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573"} err="failed to get container status \"cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573\": rpc error: code = NotFound desc = could not find container \"cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573\": container with ID starting with cacbf38fb50b43d51cbcb11d1279a5ef18baf4d8495bd6ea6c912dcaa27e7573 not found: ID does not exist" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.825438 4771 scope.go:117] "RemoveContainer" containerID="517c64ffd803e603e4a5298bb35650e43ca72ece77c19e5f1e95c893dfe82ef5" Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.834881 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69c78d7f85-zj485"] Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.838346 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69c78d7f85-zj485"] Feb 27 
01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.849175 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps"] Feb 27 01:09:49 crc kubenswrapper[4771]: I0227 01:09:49.855878 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69784b8f5d-c8rps"] Feb 27 01:09:49 crc kubenswrapper[4771]: E0227 01:09:49.930710 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63faec48_e964_4842_b418_b5f0fc000f37.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63faec48_e964_4842_b418_b5f0fc000f37.slice/crio-3e23a7b568d5a49c2385887cd6fd323b132aa02503eb19b17971370d064f67fe\": RecentStats: unable to find data in memory cache]" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633193 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cc499969c-4f6v9"] Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633648 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerName="registry-server" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633677 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerName="registry-server" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633703 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beb27eba-ff4c-41aa-b322-6ec456945455" containerName="controller-manager" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633711 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb27eba-ff4c-41aa-b322-6ec456945455" containerName="controller-manager" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633724 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerName="extract-utilities" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633734 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerName="extract-utilities" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633745 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerName="extract-content" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633754 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerName="extract-content" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633764 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerName="extract-content" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633773 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerName="extract-content" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633793 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerName="registry-server" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633804 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerName="registry-server" Feb 27 
01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633815 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerName="extract-utilities" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633823 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerName="extract-utilities" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633835 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerName="registry-server" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633848 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerName="registry-server" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633866 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63faec48-e964-4842-b418-b5f0fc000f37" containerName="route-controller-manager" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633876 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="63faec48-e964-4842-b418-b5f0fc000f37" containerName="route-controller-manager" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633887 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerName="extract-content" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633894 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerName="extract-content" Feb 27 01:09:50 crc kubenswrapper[4771]: E0227 01:09:50.633910 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerName="extract-utilities" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.633919 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerName="extract-utilities" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.634062 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="63faec48-e964-4842-b418-b5f0fc000f37" containerName="route-controller-manager" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.634077 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a309b5-c10d-49d7-ade8-3c087250dd91" containerName="registry-server" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.634092 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="beb27eba-ff4c-41aa-b322-6ec456945455" containerName="controller-manager" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.634104 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4029ae4-2dfb-4351-88f8-08fefb8ab46e" containerName="registry-server" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.634117 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="65fbc634-d941-4dee-a758-3b0b10bd60f0" containerName="registry-server" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.634738 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.636663 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.637107 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.637165 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.637291 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc"] Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.637911 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.638029 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.638268 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.638521 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.641994 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.642252 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.642794 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.643005 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.643041 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.643304 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.649197 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cc499969c-4f6v9"] Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.649880 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.657002 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc"] Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.797971 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-config\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.798059 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-config\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.798087 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de3fb7b-0efd-4341-857f-8681b73e3fd4-serving-cert\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.798118 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-client-ca\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.798150 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-serving-cert\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.798179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tbhv\" (UniqueName: \"kubernetes.io/projected/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-kube-api-access-8tbhv\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.798212 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-client-ca\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.798230 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-proxy-ca-bundles\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.798248 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjfbm\" 
(UniqueName: \"kubernetes.io/projected/9de3fb7b-0efd-4341-857f-8681b73e3fd4-kube-api-access-cjfbm\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.899537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-client-ca\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.899619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-proxy-ca-bundles\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.899652 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjfbm\" (UniqueName: \"kubernetes.io/projected/9de3fb7b-0efd-4341-857f-8681b73e3fd4-kube-api-access-cjfbm\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.901011 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-client-ca\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.901198 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-proxy-ca-bundles\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.901314 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-config\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.901602 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-config\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.901635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de3fb7b-0efd-4341-857f-8681b73e3fd4-serving-cert\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " 
pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.901699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-client-ca\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.901791 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-serving-cert\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.901834 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tbhv\" (UniqueName: \"kubernetes.io/projected/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-kube-api-access-8tbhv\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.902392 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-config\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.903027 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-client-ca\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.904745 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de3fb7b-0efd-4341-857f-8681b73e3fd4-config\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.906961 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de3fb7b-0efd-4341-857f-8681b73e3fd4-serving-cert\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.906986 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-serving-cert\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.928288 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cjfbm\" (UniqueName: \"kubernetes.io/projected/9de3fb7b-0efd-4341-857f-8681b73e3fd4-kube-api-access-cjfbm\") pod \"controller-manager-cc499969c-4f6v9\" (UID: \"9de3fb7b-0efd-4341-857f-8681b73e3fd4\") " pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.928671 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tbhv\" (UniqueName: \"kubernetes.io/projected/d329ab7f-d64c-4a3b-ad53-9c6198214cfa-kube-api-access-8tbhv\") pod \"route-controller-manager-6dfc64c5f6-k5dvc\" (UID: \"d329ab7f-d64c-4a3b-ad53-9c6198214cfa\") " pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.952321 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:50 crc kubenswrapper[4771]: I0227 01:09:50.973912 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.440939 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cc499969c-4f6v9"] Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.544965 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc"] Feb 27 01:09:51 crc kubenswrapper[4771]: W0227 01:09:51.560459 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd329ab7f_d64c_4a3b_ad53_9c6198214cfa.slice/crio-181d6c3cd3be31d8a43fedb9961296cc4f8bb373ffbd6dc46b281daa270787a1 WatchSource:0}: Error finding container 181d6c3cd3be31d8a43fedb9961296cc4f8bb373ffbd6dc46b281daa270787a1: Status 404 returned error can't find the container with id 181d6c3cd3be31d8a43fedb9961296cc4f8bb373ffbd6dc46b281daa270787a1 Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.783538 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63faec48-e964-4842-b418-b5f0fc000f37" path="/var/lib/kubelet/pods/63faec48-e964-4842-b418-b5f0fc000f37/volumes" Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.785149 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb27eba-ff4c-41aa-b322-6ec456945455" path="/var/lib/kubelet/pods/beb27eba-ff4c-41aa-b322-6ec456945455/volumes" Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.818242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" event={"ID":"d329ab7f-d64c-4a3b-ad53-9c6198214cfa","Type":"ContainerStarted","Data":"742f0b6d42a09af087809ca744ec2a44b7f219bd282ee1208863479c908bfbdb"} Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.818309 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" event={"ID":"d329ab7f-d64c-4a3b-ad53-9c6198214cfa","Type":"ContainerStarted","Data":"181d6c3cd3be31d8a43fedb9961296cc4f8bb373ffbd6dc46b281daa270787a1"} Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.820251 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.822946 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" event={"ID":"9de3fb7b-0efd-4341-857f-8681b73e3fd4","Type":"ContainerStarted","Data":"c267718bf44258d3c9676c41101eb6ee99ce36ecc600aded5322670aeff79b1a"} Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.823003 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" event={"ID":"9de3fb7b-0efd-4341-857f-8681b73e3fd4","Type":"ContainerStarted","Data":"6267af05a330cd50d1427a8f7752a599a4cb94c7efa0d5978b76d523638a66a3"} Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.823216 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.842322 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" podStartSLOduration=3.842296429 podStartE2EDuration="3.842296429s" podCreationTimestamp="2026-02-27 01:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:51.839413953 +0000 UTC m=+304.776975241" watchObservedRunningTime="2026-02-27 01:09:51.842296429 +0000 UTC m=+304.779857737" Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.851076 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.854830 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cc499969c-4f6v9" podStartSLOduration=3.854806931 podStartE2EDuration="3.854806931s" podCreationTimestamp="2026-02-27 01:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:09:51.853064425 +0000 UTC m=+304.790625723" watchObservedRunningTime="2026-02-27 01:09:51.854806931 +0000 UTC m=+304.792368219" Feb 27 01:09:51 crc kubenswrapper[4771]: I0227 01:09:51.988266 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6dfc64c5f6-k5dvc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.357011 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.358177 4771 util.go:30] "No sandbox for pod can be found. 
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.358177 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.358324 4771 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.359503 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba" gracePeriod=15
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.359538 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40" gracePeriod=15
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.359499 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e" gracePeriod=15
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.359479 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4" gracePeriod=15
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.359729 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645" gracePeriod=15
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.360790 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361030 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361050 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361069 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361081 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361095 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361107 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361145 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361158 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361176 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361188 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361206 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361221 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361245 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361261 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361287 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361301 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361323 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361336 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361506 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361532 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361584 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361603 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361627 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361642 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361668 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.361835 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.361849 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.362013 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.362034 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.532525 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.532912 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.532966 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.533010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.533224 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.533316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.533342 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.634782 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.634845 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.634879 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.634906 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.634955 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635070 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635214 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635221 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635482 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635510 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635541 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635618 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.635749 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.763424 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.763473 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.830356 4771 generic.go:334] "Generic (PLEG): container finished" podID="dea387ee-2398-4e31-b664-58bae30775ca" containerID="c8129f634d51517be3065265b53503f117583182ac0d29c505a79acf301629fd" exitCode=0 Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.830455 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dea387ee-2398-4e31-b664-58bae30775ca","Type":"ContainerDied","Data":"c8129f634d51517be3065265b53503f117583182ac0d29c505a79acf301629fd"} Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.831964 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.832320 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.832509 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.834488 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.835677 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645" exitCode=0 Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.835699 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40" exitCode=0 Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.835709 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e" exitCode=0 Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.835717 4771 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba" exitCode=2 Feb 27 01:09:52 crc kubenswrapper[4771]: I0227 01:09:52.835795 4771 scope.go:117] "RemoveContainer" containerID="6ac5cc9b673223da4e4a5da6eff1cc97a3bb76da42d54bed73554de8bd9f8846" Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.899145 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:09:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:09:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:09:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T01:09:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.899672 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.900188 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.900648 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.901169 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:52 crc kubenswrapper[4771]: E0227 01:09:52.901212 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 01:09:53 crc kubenswrapper[4771]: E0227 01:09:53.573051 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:53 crc kubenswrapper[4771]: E0227 01:09:53.573332 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 
01:09:53 crc kubenswrapper[4771]: E0227 01:09:53.573519 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:53 crc kubenswrapper[4771]: E0227 01:09:53.573698 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:53 crc kubenswrapper[4771]: E0227 01:09:53.573857 4771 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:53 crc kubenswrapper[4771]: I0227 01:09:53.573875 4771 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 27 01:09:53 crc kubenswrapper[4771]: E0227 01:09:53.574029 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="200ms" Feb 27 01:09:53 crc kubenswrapper[4771]: E0227 01:09:53.775012 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="400ms" Feb 27 01:09:53 crc kubenswrapper[4771]: I0227 01:09:53.849129 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 01:09:53 crc kubenswrapper[4771]: E0227 01:09:53.857722 4771 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" volumeName="registry-storage" Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.164483 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.165653 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:54 crc kubenswrapper[4771]: E0227 01:09:54.176475 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="800ms" Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.359676 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-kubelet-dir\") pod \"dea387ee-2398-4e31-b664-58bae30775ca\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.359811 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dea387ee-2398-4e31-b664-58bae30775ca-kube-api-access\") pod \"dea387ee-2398-4e31-b664-58bae30775ca\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.359819 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dea387ee-2398-4e31-b664-58bae30775ca" (UID: "dea387ee-2398-4e31-b664-58bae30775ca"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.359905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-var-lock\") pod \"dea387ee-2398-4e31-b664-58bae30775ca\" (UID: \"dea387ee-2398-4e31-b664-58bae30775ca\") " Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.359962 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-var-lock" (OuterVolumeSpecName: "var-lock") pod "dea387ee-2398-4e31-b664-58bae30775ca" (UID: "dea387ee-2398-4e31-b664-58bae30775ca"). InnerVolumeSpecName "var-lock". 
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.359962 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-var-lock" (OuterVolumeSpecName: "var-lock") pod "dea387ee-2398-4e31-b664-58bae30775ca" (UID: "dea387ee-2398-4e31-b664-58bae30775ca"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.360260 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.360282 4771 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dea387ee-2398-4e31-b664-58bae30775ca-var-lock\") on node \"crc\" DevicePath \"\""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.372250 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea387ee-2398-4e31-b664-58bae30775ca-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dea387ee-2398-4e31-b664-58bae30775ca" (UID: "dea387ee-2398-4e31-b664-58bae30775ca"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.495520 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dea387ee-2398-4e31-b664-58bae30775ca-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.746714 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.747689 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.748267 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.748990 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.859882 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.860826 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4" exitCode=0
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.860907 4771 scope.go:117] "RemoveContainer" containerID="2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.861002 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.863460 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dea387ee-2398-4e31-b664-58bae30775ca","Type":"ContainerDied","Data":"df397424bf00a1373d8980140f8119be6a6ba254b1c8dc93c00d0402d27d9b66"}
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.863513 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df397424bf00a1373d8980140f8119be6a6ba254b1c8dc93c00d0402d27d9b66"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.863526 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.877946 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.878510 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.881356 4771 scope.go:117] "RemoveContainer" containerID="45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.897817 4771 scope.go:117] "RemoveContainer" containerID="7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.899869 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.899970 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.899994 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.900020 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.900036 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.900163 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.900299 4771 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.900319 4771 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.900337 4771 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.920086 4771 scope.go:117] "RemoveContainer" containerID="cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.938413 4771 scope.go:117] "RemoveContainer" containerID="d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.962370 4771 scope.go:117] "RemoveContainer" containerID="c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b"
Feb 27 01:09:54 crc kubenswrapper[4771]: E0227 01:09:54.977544 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="1.6s"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.983767 4771 scope.go:117] "RemoveContainer" containerID="2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645"
Feb 27 01:09:54 crc kubenswrapper[4771]: E0227 01:09:54.984295 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\": container with ID starting with 2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645 not found: ID does not exist" containerID="2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.984373 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645"} err="failed to get container status \"2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\": rpc error: code = NotFound desc = could not find container \"2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645\": container with ID starting with 2e1e64fbb995a949ebe7793765a90294f61fe894a598349ccca9a47e1e4f3645 not found: ID does not exist"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.984422 4771 scope.go:117] "RemoveContainer" containerID="45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40"
Feb 27 01:09:54 crc kubenswrapper[4771]: E0227 01:09:54.984910 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\": container with ID starting with 45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40 not found: ID does not exist" containerID="45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.984949 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40"} err="failed to get container status \"45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\": rpc error: code = NotFound desc = could not find container \"45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40\": container with ID starting with 45a938724b889e649f9fed36cb5ff8c4f433428484bd9c1bc9256d61d8c82b40 not found: ID does not exist"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.984976 4771 scope.go:117] "RemoveContainer" containerID="7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e"
Feb 27 01:09:54 crc kubenswrapper[4771]: E0227 01:09:54.985429 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\": container with ID starting with 7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e not found: ID does not exist" containerID="7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.985484 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e"} err="failed to get container status \"7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\": rpc error: code = NotFound desc = could not find container \"7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e\": container with ID starting with 7b3c4fe361e12a9e5c3f0ec349e8fbd8d06361d9df8258362a5140348369081e not found: ID does not exist"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.985522 4771 scope.go:117] "RemoveContainer" containerID="cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba"
Feb 27 01:09:54 crc kubenswrapper[4771]: E0227 01:09:54.985927 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\": container with ID starting with cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba not found: ID does not exist" containerID="cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.985964 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba"} err="failed to get container status \"cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\": rpc error: code = NotFound desc = could not find container \"cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba\": container with ID starting with cc5b1a5435f46d0b41efb472e9aea381174f71ddda3445660d070c25923526ba not found: ID does not exist"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.985987 4771 scope.go:117] "RemoveContainer" containerID="d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4"
Feb 27 01:09:54 crc kubenswrapper[4771]: E0227 01:09:54.986401 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\": container with ID starting with d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4 not found: ID does not exist" containerID="d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.986434 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4"} err="failed to get container status \"d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\": rpc error: code = NotFound desc = could not find container \"d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4\": container with ID starting with d5f28dbeec81fc32f67944ed3afbd75839b3a0c3f513bc34596a0fe9333e69b4 not found: ID does not exist"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.986454 4771 scope.go:117] "RemoveContainer" containerID="c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b"
Feb 27 01:09:54 crc kubenswrapper[4771]: E0227 01:09:54.986867 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\": container with ID starting with c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b not found: ID does not exist" containerID="c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b"
Feb 27 01:09:54 crc kubenswrapper[4771]: I0227 01:09:54.986925 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b"} err="failed to get container status \"c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\": rpc error: code = NotFound desc = could not find container \"c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b\": container with ID starting with c98d0f21ccd01809569f3025eded0583e1429c6afecf4dde8563ab107623401b not found: ID does not exist"
Feb 27 01:09:55 crc kubenswrapper[4771]: I0227 01:09:55.185424 4771 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Feb 27 01:09:55 crc kubenswrapper[4771]: I0227 01:09:55.186110 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Feb 27 01:09:55 crc kubenswrapper[4771]: I0227 01:09:55.783064 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 27 01:09:56 crc kubenswrapper[4771]: E0227 01:09:56.578464 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="3.2s"
Feb 27 01:09:57 crc kubenswrapper[4771]: E0227 01:09:57.394169 4771 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 01:09:57 crc kubenswrapper[4771]: I0227 01:09:57.394622 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 01:09:57 crc kubenswrapper[4771]: E0227 01:09:57.427877 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897f53792bd44fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:09:57.427463419 +0000 UTC m=+310.365024727,LastTimestamp:2026-02-27 01:09:57.427463419 +0000 UTC m=+310.365024727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 01:09:57 crc kubenswrapper[4771]: I0227 01:09:57.776616 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused"
Feb 27 01:09:57 crc kubenswrapper[4771]: I0227 01:09:57.883569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205"}
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"40abe10d8ce73af4e04cb9b8f473a7e9fbd1c074545d8df7765d7114fd5566ae"} Feb 27 01:09:57 crc kubenswrapper[4771]: E0227 01:09:57.884140 4771 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:09:57 crc kubenswrapper[4771]: I0227 01:09:57.884239 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:58 crc kubenswrapper[4771]: I0227 01:09:58.952888 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:09:58 crc kubenswrapper[4771]: I0227 01:09:58.953754 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:09:58 crc kubenswrapper[4771]: I0227 01:09:58.953840 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:09:58 crc kubenswrapper[4771]: I0227 01:09:58.954727 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:09:58 crc kubenswrapper[4771]: I0227 01:09:58.954851 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867" gracePeriod=600 Feb 27 01:09:59 crc kubenswrapper[4771]: E0227 01:09:59.175509 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897f53792bd44fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:09:57.427463419 +0000 UTC m=+310.365024727,LastTimestamp:2026-02-27 01:09:57.427463419 +0000 UTC m=+310.365024727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:09:59 crc kubenswrapper[4771]: E0227 01:09:59.779881 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="6.4s" Feb 27 01:09:59 crc kubenswrapper[4771]: I0227 01:09:59.898980 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867" exitCode=0 Feb 27 01:09:59 crc kubenswrapper[4771]: I0227 01:09:59.899050 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867"} Feb 27 01:09:59 crc kubenswrapper[4771]: I0227 01:09:59.899096 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"3edbc767662cebd8ad4cf0660d8b2225989bc9c500a2684a30fb57d6c7bf5f5f"} Feb 27 01:09:59 crc kubenswrapper[4771]: I0227 01:09:59.900073 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:09:59 crc kubenswrapper[4771]: I0227 01:09:59.900606 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:06 crc kubenswrapper[4771]: E0227 01:10:06.181711 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.189:6443: connect: connection refused" interval="7s" Feb 27 01:10:06 crc kubenswrapper[4771]: I0227 01:10:06.946533 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 01:10:06 crc kubenswrapper[4771]: I0227 01:10:06.947705 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 01:10:06 crc kubenswrapper[4771]: I0227 01:10:06.947784 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf" exitCode=1 Feb 27 01:10:06 crc kubenswrapper[4771]: I0227 01:10:06.947837 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf"} Feb 27 01:10:06 crc kubenswrapper[4771]: I0227 01:10:06.948422 4771 scope.go:117] "RemoveContainer" containerID="74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf" Feb 27 01:10:06 crc kubenswrapper[4771]: I0227 01:10:06.948713 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:06 crc kubenswrapper[4771]: I0227 01:10:06.949188 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:06 crc kubenswrapper[4771]: I0227 01:10:06.949542 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.165130 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.639893 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" containerName="oauth-openshift" containerID="cri-o://2f27cd8996898523845f5cd911350e6f00b9b64e518e26cf10570b63b113837a" gracePeriod=15 Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.774950 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.780925 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.781448 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.782151 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.782822 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.783071 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.783692 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.794153 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.794183 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:07 crc kubenswrapper[4771]: E0227 01:10:07.794465 4771 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.794966 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:07 crc kubenswrapper[4771]: W0227 01:10:07.824728 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-a519818ec9975f666a9b3ba5a080d596d6e93c955d3d05755c7225416728dd72 WatchSource:0}: Error finding container a519818ec9975f666a9b3ba5a080d596d6e93c955d3d05755c7225416728dd72: Status 404 returned error can't find the container with id a519818ec9975f666a9b3ba5a080d596d6e93c955d3d05755c7225416728dd72 Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.956469 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a519818ec9975f666a9b3ba5a080d596d6e93c955d3d05755c7225416728dd72"} Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.962692 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.963850 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.963927 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4029c63755600e25b181f123ca63023911715b2cffea80198d5c361bd309b14"} Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.965671 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.966169 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.966647 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.967434 4771 generic.go:334] "Generic (PLEG): container finished" podID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" containerID="2f27cd8996898523845f5cd911350e6f00b9b64e518e26cf10570b63b113837a" exitCode=0 Feb 27 01:10:07 crc kubenswrapper[4771]: I0227 01:10:07.967578 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" 
event={"ID":"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7","Type":"ContainerDied","Data":"2f27cd8996898523845f5cd911350e6f00b9b64e518e26cf10570b63b113837a"} Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.219219 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.219718 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.219943 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.220181 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.220414 4771 status_manager.go:851] "Failed to get status for pod" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9scwl\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.268751 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-idp-0-file-data\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.268815 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-cliconfig\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.268859 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khtxn\" (UniqueName: \"kubernetes.io/projected/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-kube-api-access-khtxn\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.268917 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-serving-cert\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc 
kubenswrapper[4771]: I0227 01:10:08.268951 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-session\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.268982 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-error\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269018 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-service-ca\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269054 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-dir\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269089 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-policies\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-router-certs\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269198 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-login\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269242 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-ocp-branding-template\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269281 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-provider-selection\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269321 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-trusted-ca-bundle\") pod \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\" (UID: \"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7\") " Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.269771 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.270295 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.270383 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.273448 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.274400 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.275895 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.276477 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-kube-api-access-khtxn" (OuterVolumeSpecName: "kube-api-access-khtxn") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "kube-api-access-khtxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.277072 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.277613 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.278069 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.278292 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.278482 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.279226 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.285999 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" (UID: "8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370323 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370373 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370387 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khtxn\" (UniqueName: \"kubernetes.io/projected/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-kube-api-access-khtxn\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370399 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370413 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370425 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370438 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370450 4771 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370465 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370477 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370489 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370501 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath 
\"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370516 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.370530 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.975624 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" event={"ID":"8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7","Type":"ContainerDied","Data":"bd48d9c6210401197a31138d84b868da7abaf8546c8833dfe1bc6c759639d834"} Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.975688 4771 scope.go:117] "RemoveContainer" containerID="2f27cd8996898523845f5cd911350e6f00b9b64e518e26cf10570b63b113837a" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.976660 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.977476 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.978051 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.978141 4771 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7ae7600b28a8de15424052c5e7c8aad4fd9ac294bbef5f0e217612dc77b27961" exitCode=0 Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.978252 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.978425 4771 status_manager.go:851] "Failed to get status for pod" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9scwl\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.978464 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"7ae7600b28a8de15424052c5e7c8aad4fd9ac294bbef5f0e217612dc77b27961"} Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.978906 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.979028 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.979251 4771 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.979581 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.979870 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: E0227 01:10:08.980029 4771 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.980147 4771 status_manager.go:851] "Failed to get status for pod" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9scwl\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.998448 4771 status_manager.go:851] "Failed to get status for pod" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-hw7dn\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.998985 4771 status_manager.go:851] "Failed to get status for pod" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" pod="openshift-authentication/oauth-openshift-558db77b4-9scwl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-9scwl\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:08 crc kubenswrapper[4771]: I0227 01:10:08.999432 4771 status_manager.go:851] "Failed to get status for pod" 
podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:09 crc kubenswrapper[4771]: I0227 01:10:08.999785 4771 status_manager.go:851] "Failed to get status for pod" podUID="dea387ee-2398-4e31-b664-58bae30775ca" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.189:6443: connect: connection refused" Feb 27 01:10:09 crc kubenswrapper[4771]: E0227 01:10:09.178127 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.189:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897f53792bd44fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 01:09:57.427463419 +0000 UTC m=+310.365024727,LastTimestamp:2026-02-27 01:09:57.427463419 +0000 UTC m=+310.365024727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 01:10:09 crc kubenswrapper[4771]: I0227 01:10:09.990092 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5f02fd45c6b8aadc9df692c00b47d441a83a9cb615dea26d0412d52900ba4c78"} Feb 27 01:10:09 crc kubenswrapper[4771]: I0227 01:10:09.990135 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"47c3468f5bd41a238e5a18c3889e0384eda2e5e3f0d6ae99669666281acd9128"} Feb 27 01:10:09 crc kubenswrapper[4771]: I0227 01:10:09.990145 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c6a291b5813aea49d9341cb9f8c6478f7614b80e6c7abf725b28833e6cd50aac"} Feb 27 01:10:11 crc kubenswrapper[4771]: I0227 01:10:11.002362 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"de76d4ab5303c3dabb589895dd239deeeca38bf086943e8fd5ac533b68cf260d"} Feb 27 01:10:11 crc kubenswrapper[4771]: I0227 01:10:11.002653 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:11 crc kubenswrapper[4771]: I0227 01:10:11.002670 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"26093bf011b45aef96d971cd9a20ebf09b32ce95130e3e1f5f41b1a510846115"} Feb 27 01:10:11 crc kubenswrapper[4771]: I0227 01:10:11.002810 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:11 crc kubenswrapper[4771]: I0227 01:10:11.002845 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:12 crc kubenswrapper[4771]: I0227 01:10:12.795349 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:12 crc kubenswrapper[4771]: I0227 01:10:12.795417 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:12 crc kubenswrapper[4771]: I0227 01:10:12.803138 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:14 crc kubenswrapper[4771]: I0227 01:10:14.702294 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:10:16 crc kubenswrapper[4771]: I0227 01:10:16.015743 4771 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:16 crc kubenswrapper[4771]: I0227 01:10:16.281734 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:10:16 crc kubenswrapper[4771]: I0227 01:10:16.282602 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 01:10:16 crc kubenswrapper[4771]: I0227 01:10:16.282642 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 01:10:17 crc kubenswrapper[4771]: I0227 01:10:17.034986 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:17 crc kubenswrapper[4771]: I0227 01:10:17.035655 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:17 crc kubenswrapper[4771]: I0227 01:10:17.041290 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:17 crc kubenswrapper[4771]: I0227 01:10:17.809590 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d3716583-fe22-4df0-9a52-1b686b016f91" Feb 27 01:10:18 crc kubenswrapper[4771]: I0227 01:10:18.040194 4771 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:18 crc kubenswrapper[4771]: I0227 01:10:18.040237 4771 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4789374-f7c5-4270-a54a-5fbdd6319021" Feb 27 01:10:18 crc kubenswrapper[4771]: I0227 01:10:18.043377 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d3716583-fe22-4df0-9a52-1b686b016f91" Feb 27 01:10:25 crc kubenswrapper[4771]: I0227 01:10:25.691469 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 27 01:10:25 crc kubenswrapper[4771]: I0227 01:10:25.978381 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 01:10:25 crc kubenswrapper[4771]: I0227 01:10:25.982122 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 27 01:10:26 crc kubenswrapper[4771]: I0227 01:10:26.115425 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 01:10:26 crc kubenswrapper[4771]: I0227 01:10:26.208826 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 27 01:10:26 crc kubenswrapper[4771]: I0227 01:10:26.281647 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 01:10:26 crc kubenswrapper[4771]: I0227 01:10:26.281725 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 01:10:26 crc kubenswrapper[4771]: I0227 01:10:26.963031 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 01:10:27 crc kubenswrapper[4771]: I0227 01:10:27.054126 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 27 01:10:27 crc kubenswrapper[4771]: I0227 01:10:27.482400 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 01:10:27 crc kubenswrapper[4771]: I0227 01:10:27.567266 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 27 01:10:27 crc kubenswrapper[4771]: I0227 01:10:27.636091 4771 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 27 01:10:27 crc kubenswrapper[4771]: I0227 01:10:27.639098 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 01:10:27 crc kubenswrapper[4771]: I0227 01:10:27.714440 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 27 
01:10:27 crc kubenswrapper[4771]: I0227 01:10:27.730065 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 27 01:10:27 crc kubenswrapper[4771]: I0227 01:10:27.818523 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.530832 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.565635 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.587537 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.626900 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.682500 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.812354 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.851446 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.941083 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.954253 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 27 01:10:28 crc kubenswrapper[4771]: I0227 01:10:28.992670 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.040706 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.053669 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.327934 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.415051 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.439337 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.614043 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.662798 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 27 01:10:29 crc kubenswrapper[4771]: 
I0227 01:10:29.705196 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.847911 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.858569 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.917802 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.930605 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 01:10:29 crc kubenswrapper[4771]: I0227 01:10:29.968839 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.022628 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.028869 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.040854 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.411694 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.462632 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.565499 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.633132 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.730901 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.891323 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.922993 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.979957 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 27 01:10:30 crc kubenswrapper[4771]: I0227 01:10:30.984255 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.033038 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 01:10:31 crc 
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.271095 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.307362 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.308743 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.372569 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.412855 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.501475 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.599066 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.621782 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.654316 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.671176 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.698640 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.724221 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.806842 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.916402 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 27 01:10:31 crc kubenswrapper[4771]: I0227 01:10:31.921994 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.008265 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.068541 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.150607 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.276453 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.292963 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.319748 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.371791 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.374613 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.460881 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.475660 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.500132 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.525840 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.587403 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.588381 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.638282 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.689836 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.690476 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.735974 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.747921 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.759700 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.768787 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.786809 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.845294 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.926107 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.964462 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 27 01:10:32 crc kubenswrapper[4771]: I0227 01:10:32.987003 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.076670 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.078166 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.206901 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.216593 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.487804 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.496835 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.564847 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.566999 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.647984 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.680345 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.695125 4771 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.743710 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.780481 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.791131 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.793348 4771 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.894091 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.916519 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.931576 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 01:10:33 crc kubenswrapper[4771]: I0227 01:10:33.958850 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.070105 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.094133 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.150487 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.184481 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.200819 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.298066 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.315299 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.319315 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.329268 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.367439 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.664724 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.739220 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.763446 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 27 01:10:34 crc kubenswrapper[4771]: I0227 01:10:34.979152 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.052803 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.069539 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.140757 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.144612 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.331607 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.465229 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.549623 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.606321 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.615722 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.645641 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.691049 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.700057 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.780043 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.782437 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.814879 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.887761 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.913148 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 01:10:35 crc kubenswrapper[4771]: I0227 01:10:35.949155 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.225014 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.231809 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.271317 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.281656 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.281736 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.281819 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.282830 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c4029c63755600e25b181f123ca63023911715b2cffea80198d5c361bd309b14"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.283063 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://c4029c63755600e25b181f123ca63023911715b2cffea80198d5c361bd309b14" gracePeriod=30 Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.293269 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.310796 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.446668 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.450567 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.475768 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.587384 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.600233 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.758478 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.850748 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 01:10:36 crc kubenswrapper[4771]: I0227 01:10:36.963782 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.089833 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.107500 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.125134 4771 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.133095 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9scwl","openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.133189 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.138671 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.181008 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.180987786 podStartE2EDuration="21.180987786s" podCreationTimestamp="2026-02-27 01:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:10:37.155838547 +0000 UTC m=+350.093399835" watchObservedRunningTime="2026-02-27 01:10:37.180987786 +0000 UTC m=+350.118549084" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.211734 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.244720 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.258066 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.260475 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.310296 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.334229 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.377872 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.401705 4771 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.500083 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.671492 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.675141 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.681133 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.694753 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.780491 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" path="/var/lib/kubelet/pods/8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7/volumes" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.817687 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.834351 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.836796 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.940460 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 27 01:10:37 crc kubenswrapper[4771]: I0227 01:10:37.967178 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.049859 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.075145 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.147083 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.151075 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.195718 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.256805 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.335688 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.427641 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.503222 4771 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.503578 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205" gracePeriod=5 Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.612342 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.845890 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.855091 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 27 01:10:38 crc kubenswrapper[4771]: I0227 01:10:38.987139 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.030525 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.279795 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.320867 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.597748 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.658625 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.710479 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.718816 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.744178 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.844095 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 01:10:39 crc kubenswrapper[4771]: I0227 01:10:39.881025 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.006304 4771 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.229577 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.284308 4771 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.379446 4771 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.397896 4771 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.443967 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.561343 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.583514 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.730036 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.846431 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.945690 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 27 01:10:40 crc kubenswrapper[4771]: I0227 01:10:40.953140 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 01:10:41 crc kubenswrapper[4771]: I0227 01:10:41.189963 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 01:10:41 crc kubenswrapper[4771]: I0227 01:10:41.238001 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 27 01:10:41 crc kubenswrapper[4771]: I0227 01:10:41.395404 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 27 01:10:41 crc kubenswrapper[4771]: I0227 01:10:41.527375 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 27 01:10:41 crc kubenswrapper[4771]: I0227 01:10:41.573621 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 27 01:10:41 crc kubenswrapper[4771]: I0227 01:10:41.825050 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.672197 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd"] Feb 27 01:10:43 crc kubenswrapper[4771]: E0227 01:10:43.672513 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea387ee-2398-4e31-b664-58bae30775ca" 
containerName="installer" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.672533 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea387ee-2398-4e31-b664-58bae30775ca" containerName="installer" Feb 27 01:10:43 crc kubenswrapper[4771]: E0227 01:10:43.672595 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.672608 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 01:10:43 crc kubenswrapper[4771]: E0227 01:10:43.672631 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" containerName="oauth-openshift" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.672644 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" containerName="oauth-openshift" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.672792 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.672813 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8313e6c4-6dba-4edc-9e7b-5b7389c7bcf7" containerName="oauth-openshift" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.672838 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea387ee-2398-4e31-b664-58bae30775ca" containerName="installer" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.673456 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.677684 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.678012 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.678674 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.678922 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.679104 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.679250 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.679674 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.679868 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.680150 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.680326 
4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.680995 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.681032 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.681098 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd"] Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.690971 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.691301 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.702031 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729161 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6k8\" (UniqueName: \"kubernetes.io/projected/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-kube-api-access-vx6k8\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729201 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-audit-policies\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729247 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-audit-dir\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729269 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc 
kubenswrapper[4771]: I0227 01:10:43.729342 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729371 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729452 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729477 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729505 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729529 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729592 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-session\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729624 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.729648 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831078 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831240 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831327 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831398 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-session\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831434 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831466 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831544 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6k8\" (UniqueName: \"kubernetes.io/projected/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-kube-api-access-vx6k8\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831614 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-audit-policies\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831803 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-audit-dir\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831840 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.831871 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.832508 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-audit-dir\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.832830 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-audit-policies\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.833153 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.833492 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.834800 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.836983 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.838000 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.838513 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.838713 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.839667 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-session\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.840117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.840185 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.841097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.849889 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6k8\" (UniqueName: \"kubernetes.io/projected/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d-kube-api-access-vx6k8\") pod \"oauth-openshift-7f8484fbcc-qxqnd\" (UID: \"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\") " pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:43 crc kubenswrapper[4771]: I0227 01:10:43.992869 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.076520 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.076628 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.135852 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.135977 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.136069 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.136164 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.136210 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.135978 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.136697 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.136652 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.136662 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.142592 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.202398 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.202480 4771 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205" exitCode=137 Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.202537 4771 scope.go:117] "RemoveContainer" containerID="cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.202543 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.227066 4771 scope.go:117] "RemoveContainer" containerID="cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205" Feb 27 01:10:44 crc kubenswrapper[4771]: E0227 01:10:44.227541 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205\": container with ID starting with cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205 not found: ID does not exist" containerID="cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.227593 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205"} err="failed to get container status \"cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205\": rpc error: code = NotFound desc = could not find container \"cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205\": container with ID starting with cae11d46c067386cc68428efdb4e262868b574c2901efa46d820b67793ef1205 not found: ID does not exist" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.237903 4771 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.237944 4771 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.237956 4771 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.237967 4771 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:44 crc kubenswrapper[4771]: I0227 01:10:44.237978 4771 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 01:10:45 crc kubenswrapper[4771]: I0227 01:10:45.785821 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 27 01:10:47 crc kubenswrapper[4771]: E0227 01:10:47.174072 4771 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 27 01:10:47 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461" Netns:"/var/run/netns/6617d363-73d1-4a44-a5af-f3c283f6c7a7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not found Feb 27 01:10:47 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:10:47 crc kubenswrapper[4771]: > Feb 27 01:10:47 crc kubenswrapper[4771]: E0227 01:10:47.174173 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 27 01:10:47 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461" Netns:"/var/run/netns/6617d363-73d1-4a44-a5af-f3c283f6c7a7" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not found Feb 27 01:10:47 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:10:47 crc kubenswrapper[4771]: > pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:47 crc kubenswrapper[4771]: E0227 01:10:47.174206 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 27 01:10:47 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461" Netns:"/var/run/netns/6617d363-73d1-4a44-a5af-f3c283f6c7a7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not found Feb 27 01:10:47 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:10:47 crc kubenswrapper[4771]: > pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:47 crc kubenswrapper[4771]: E0227 01:10:47.174285 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication(0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d)\" with CreatePodSandboxError: \"Failed to create sandbox 
for pod \\\"oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication(0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461\\\" Netns:\\\"/var/run/netns/6617d363-73d1-4a44-a5af-f3c283f6c7a7\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=31759ea10c6f7f64cf79bde33799a61b01b745762d102f9b7460f72a568d8461;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod \\\"oauth-openshift-7f8484fbcc-qxqnd\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" podUID="0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Feb 27 01:10:47 crc kubenswrapper[4771]: I0227 01:10:47.221871 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:47 crc kubenswrapper[4771]: I0227 01:10:47.222504 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:50 crc kubenswrapper[4771]: E0227 01:10:50.449510 4771 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 27 01:10:50 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b" Netns:"/var/run/netns/34b2d4c8-0393-4605-b5ed-4faeb9ca1fd8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not found Feb 27 01:10:50 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:10:50 crc kubenswrapper[4771]: > Feb 27 01:10:50 crc kubenswrapper[4771]: E0227 01:10:50.450301 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 27 01:10:50 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b" Netns:"/var/run/netns/34b2d4c8-0393-4605-b5ed-4faeb9ca1fd8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not found Feb 27 01:10:50 crc 
kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:10:50 crc kubenswrapper[4771]: > pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:50 crc kubenswrapper[4771]: E0227 01:10:50.450335 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 27 01:10:50 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b" Netns:"/var/run/netns/34b2d4c8-0393-4605-b5ed-4faeb9ca1fd8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not found Feb 27 01:10:50 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:10:50 crc kubenswrapper[4771]: > pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:10:50 crc kubenswrapper[4771]: E0227 01:10:50.450486 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication(0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication(0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b\\\" 
Netns:\\\"/var/run/netns/34b2d4c8-0393-4605-b5ed-4faeb9ca1fd8\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=6cdd73a13287ee719ce7a64363e190703032ec881849ca84620a2d72a8c71e6b;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod \\\"oauth-openshift-7f8484fbcc-qxqnd\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" podUID="0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Feb 27 01:10:54 crc kubenswrapper[4771]: I0227 01:10:54.057589 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 27 01:10:55 crc kubenswrapper[4771]: I0227 01:10:55.254786 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 27 01:10:56 crc kubenswrapper[4771]: I0227 01:10:56.398470 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 01:10:56 crc kubenswrapper[4771]: I0227 01:10:56.422611 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.673414 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535910-59vm6"] Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.674334 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.679264 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.679420 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.679527 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.683101 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-59vm6"] Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.708798 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r282c\" (UniqueName: \"kubernetes.io/projected/13c83ff4-13dd-4091-a05b-1f9b624fa886-kube-api-access-r282c\") pod \"auto-csr-approver-29535910-59vm6\" (UID: \"13c83ff4-13dd-4091-a05b-1f9b624fa886\") " pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.810135 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r282c\" (UniqueName: \"kubernetes.io/projected/13c83ff4-13dd-4091-a05b-1f9b624fa886-kube-api-access-r282c\") pod \"auto-csr-approver-29535910-59vm6\" (UID: \"13c83ff4-13dd-4091-a05b-1f9b624fa886\") " pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.828458 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r282c\" (UniqueName: \"kubernetes.io/projected/13c83ff4-13dd-4091-a05b-1f9b624fa886-kube-api-access-r282c\") pod \"auto-csr-approver-29535910-59vm6\" (UID: \"13c83ff4-13dd-4091-a05b-1f9b624fa886\") " pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:10:57 crc kubenswrapper[4771]: I0227 01:10:57.991812 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:10:58 crc kubenswrapper[4771]: I0227 01:10:58.420041 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 27 01:10:58 crc kubenswrapper[4771]: I0227 01:10:58.433621 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 27 01:10:59 crc kubenswrapper[4771]: I0227 01:10:59.979624 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 27 01:11:00 crc kubenswrapper[4771]: I0227 01:11:00.378665 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 27 01:11:01 crc kubenswrapper[4771]: E0227 01:11:01.142593 4771 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 27 01:11:01 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535910-59vm6_openshift-infra_13c83ff4-13dd-4091-a05b-1f9b624fa886_0(29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107): error adding pod openshift-infra_auto-csr-approver-29535910-59vm6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107" Netns:"/var/run/netns/d2c11a62-e970-4cb5-ad14-76682208cb13" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535910-59vm6;K8S_POD_INFRA_CONTAINER_ID=29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107;K8S_POD_UID=13c83ff4-13dd-4091-a05b-1f9b624fa886" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535910-59vm6] networking: Multus: [openshift-infra/auto-csr-approver-29535910-59vm6/13c83ff4-13dd-4091-a05b-1f9b624fa886]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29535910-59vm6 in out of cluster comm: pod "auto-csr-approver-29535910-59vm6" not found Feb 27 01:11:01 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:01 crc kubenswrapper[4771]: > Feb 27 01:11:01 crc kubenswrapper[4771]: E0227 01:11:01.142919 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 27 01:11:01 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535910-59vm6_openshift-infra_13c83ff4-13dd-4091-a05b-1f9b624fa886_0(29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107): error adding pod openshift-infra_auto-csr-approver-29535910-59vm6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107" Netns:"/var/run/netns/d2c11a62-e970-4cb5-ad14-76682208cb13" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535910-59vm6;K8S_POD_INFRA_CONTAINER_ID=29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107;K8S_POD_UID=13c83ff4-13dd-4091-a05b-1f9b624fa886" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535910-59vm6] networking: Multus: [openshift-infra/auto-csr-approver-29535910-59vm6/13c83ff4-13dd-4091-a05b-1f9b624fa886]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29535910-59vm6 in out of cluster comm: pod "auto-csr-approver-29535910-59vm6" not found Feb 27 01:11:01 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:01 crc kubenswrapper[4771]: > pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:01 crc kubenswrapper[4771]: E0227 01:11:01.142943 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 27 01:11:01 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535910-59vm6_openshift-infra_13c83ff4-13dd-4091-a05b-1f9b624fa886_0(29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107): error adding pod openshift-infra_auto-csr-approver-29535910-59vm6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107" Netns:"/var/run/netns/d2c11a62-e970-4cb5-ad14-76682208cb13" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535910-59vm6;K8S_POD_INFRA_CONTAINER_ID=29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107;K8S_POD_UID=13c83ff4-13dd-4091-a05b-1f9b624fa886" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535910-59vm6] networking: Multus: [openshift-infra/auto-csr-approver-29535910-59vm6/13c83ff4-13dd-4091-a05b-1f9b624fa886]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29535910-59vm6 in out of cluster comm: pod "auto-csr-approver-29535910-59vm6" not found Feb 27 01:11:01 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:01 crc kubenswrapper[4771]: > pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:01 crc kubenswrapper[4771]: E0227 01:11:01.143004 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535910-59vm6_openshift-infra(13c83ff4-13dd-4091-a05b-1f9b624fa886)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"auto-csr-approver-29535910-59vm6_openshift-infra(13c83ff4-13dd-4091-a05b-1f9b624fa886)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535910-59vm6_openshift-infra_13c83ff4-13dd-4091-a05b-1f9b624fa886_0(29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107): error adding pod openshift-infra_auto-csr-approver-29535910-59vm6 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107\\\" Netns:\\\"/var/run/netns/d2c11a62-e970-4cb5-ad14-76682208cb13\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535910-59vm6;K8S_POD_INFRA_CONTAINER_ID=29b45299b273eb775ceae47942c8946692d265ffd2b41c538109a0b04848f107;K8S_POD_UID=13c83ff4-13dd-4091-a05b-1f9b624fa886\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535910-59vm6] networking: Multus: [openshift-infra/auto-csr-approver-29535910-59vm6/13c83ff4-13dd-4091-a05b-1f9b624fa886]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29535910-59vm6 in out of cluster comm: pod \\\"auto-csr-approver-29535910-59vm6\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29535910-59vm6" podUID="13c83ff4-13dd-4091-a05b-1f9b624fa886" Feb 27 01:11:01 crc kubenswrapper[4771]: I0227 01:11:01.421175 4771 generic.go:334] "Generic (PLEG): container finished" podID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerID="a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de" exitCode=0 Feb 27 01:11:01 crc kubenswrapper[4771]: I0227 01:11:01.421253 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:01 crc kubenswrapper[4771]: I0227 01:11:01.421310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" event={"ID":"963fd070-b5e6-4a67-afd6-d056aacf8bc2","Type":"ContainerDied","Data":"a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de"} Feb 27 01:11:01 crc kubenswrapper[4771]: I0227 01:11:01.421672 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:01 crc kubenswrapper[4771]: I0227 01:11:01.421962 4771 scope.go:117] "RemoveContainer" containerID="a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de" Feb 27 01:11:01 crc kubenswrapper[4771]: I0227 01:11:01.773197 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:11:01 crc kubenswrapper[4771]: I0227 01:11:01.774593 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:11:01 crc kubenswrapper[4771]: I0227 01:11:01.878218 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 01:11:02 crc kubenswrapper[4771]: I0227 01:11:02.430964 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" event={"ID":"963fd070-b5e6-4a67-afd6-d056aacf8bc2","Type":"ContainerStarted","Data":"29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb"} Feb 27 01:11:02 crc kubenswrapper[4771]: I0227 01:11:02.433707 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:11:02 crc kubenswrapper[4771]: I0227 01:11:02.435331 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:11:04 crc kubenswrapper[4771]: E0227 01:11:04.664519 4771 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 27 01:11:04 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535910-59vm6_openshift-infra_13c83ff4-13dd-4091-a05b-1f9b624fa886_0(11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30): error adding pod openshift-infra_auto-csr-approver-29535910-59vm6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30" Netns:"/var/run/netns/8f47da1c-bfd5-442c-b38e-018c79f1bed0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535910-59vm6;K8S_POD_INFRA_CONTAINER_ID=11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30;K8S_POD_UID=13c83ff4-13dd-4091-a05b-1f9b624fa886" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535910-59vm6] networking: Multus: [openshift-infra/auto-csr-approver-29535910-59vm6/13c83ff4-13dd-4091-a05b-1f9b624fa886]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29535910-59vm6 in out of cluster comm: pod "auto-csr-approver-29535910-59vm6" not found Feb 27 01:11:04 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:04 crc kubenswrapper[4771]: > Feb 27 01:11:04 crc kubenswrapper[4771]: E0227 01:11:04.664880 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 27 01:11:04 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535910-59vm6_openshift-infra_13c83ff4-13dd-4091-a05b-1f9b624fa886_0(11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30): error adding pod openshift-infra_auto-csr-approver-29535910-59vm6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30" Netns:"/var/run/netns/8f47da1c-bfd5-442c-b38e-018c79f1bed0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535910-59vm6;K8S_POD_INFRA_CONTAINER_ID=11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30;K8S_POD_UID=13c83ff4-13dd-4091-a05b-1f9b624fa886" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535910-59vm6] networking: Multus: [openshift-infra/auto-csr-approver-29535910-59vm6/13c83ff4-13dd-4091-a05b-1f9b624fa886]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29535910-59vm6 in out of cluster comm: pod "auto-csr-approver-29535910-59vm6" not found Feb 27 01:11:04 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:04 crc kubenswrapper[4771]: > pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:04 crc kubenswrapper[4771]: E0227 01:11:04.664905 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 27 01:11:04 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535910-59vm6_openshift-infra_13c83ff4-13dd-4091-a05b-1f9b624fa886_0(11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30): error adding pod openshift-infra_auto-csr-approver-29535910-59vm6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30" Netns:"/var/run/netns/8f47da1c-bfd5-442c-b38e-018c79f1bed0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535910-59vm6;K8S_POD_INFRA_CONTAINER_ID=11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30;K8S_POD_UID=13c83ff4-13dd-4091-a05b-1f9b624fa886" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535910-59vm6] networking: Multus: [openshift-infra/auto-csr-approver-29535910-59vm6/13c83ff4-13dd-4091-a05b-1f9b624fa886]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29535910-59vm6 in out of cluster comm: pod "auto-csr-approver-29535910-59vm6" not found Feb 27 01:11:04 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:04 crc kubenswrapper[4771]: > pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:04 crc kubenswrapper[4771]: E0227 01:11:04.664973 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535910-59vm6_openshift-infra(13c83ff4-13dd-4091-a05b-1f9b624fa886)\" 
with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535910-59vm6_openshift-infra(13c83ff4-13dd-4091-a05b-1f9b624fa886)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535910-59vm6_openshift-infra_13c83ff4-13dd-4091-a05b-1f9b624fa886_0(11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30): error adding pod openshift-infra_auto-csr-approver-29535910-59vm6 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30\\\" Netns:\\\"/var/run/netns/8f47da1c-bfd5-442c-b38e-018c79f1bed0\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535910-59vm6;K8S_POD_INFRA_CONTAINER_ID=11dce4802dc058303609e634242608d13d6cb769a4f1395a852d86197f7dba30;K8S_POD_UID=13c83ff4-13dd-4091-a05b-1f9b624fa886\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535910-59vm6] networking: Multus: [openshift-infra/auto-csr-approver-29535910-59vm6/13c83ff4-13dd-4091-a05b-1f9b624fa886]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29535910-59vm6 in out of cluster comm: pod \\\"auto-csr-approver-29535910-59vm6\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29535910-59vm6" podUID="13c83ff4-13dd-4091-a05b-1f9b624fa886" Feb 27 01:11:05 crc kubenswrapper[4771]: E0227 01:11:05.021313 4771 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 27 01:11:05 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa" Netns:"/var/run/netns/127b2e3f-aa39-45ac-9118-07c52a6d61c3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not 
found Feb 27 01:11:05 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:05 crc kubenswrapper[4771]: > Feb 27 01:11:05 crc kubenswrapper[4771]: E0227 01:11:05.021925 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 27 01:11:05 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa" Netns:"/var/run/netns/127b2e3f-aa39-45ac-9118-07c52a6d61c3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not found Feb 27 01:11:05 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:05 crc kubenswrapper[4771]: > pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:11:05 crc kubenswrapper[4771]: E0227 01:11:05.021961 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 27 01:11:05 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa" Netns:"/var/run/netns/127b2e3f-aa39-45ac-9118-07c52a6d61c3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Path:"" 
ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod "oauth-openshift-7f8484fbcc-qxqnd" not found Feb 27 01:11:05 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 27 01:11:05 crc kubenswrapper[4771]: > pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:11:05 crc kubenswrapper[4771]: E0227 01:11:05.022052 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication(0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication(0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7f8484fbcc-qxqnd_openshift-authentication_0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d_0(323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa): error adding pod openshift-authentication_oauth-openshift-7f8484fbcc-qxqnd to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa\\\" Netns:\\\"/var/run/netns/127b2e3f-aa39-45ac-9118-07c52a6d61c3\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7f8484fbcc-qxqnd;K8S_POD_INFRA_CONTAINER_ID=323297750cc3e970e89e4bb7ad16e091e78a8b986375db73517bde31624228fa;K8S_POD_UID=0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd] networking: Multus: [openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd/0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7f8484fbcc-qxqnd in out of cluster comm: pod \\\"oauth-openshift-7f8484fbcc-qxqnd\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" podUID="0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d" Feb 27 01:11:05 crc kubenswrapper[4771]: I0227 01:11:05.725399 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 01:11:06 crc 
kubenswrapper[4771]: I0227 01:11:06.459333 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 27 01:11:06 crc kubenswrapper[4771]: I0227 01:11:06.462159 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 01:11:06 crc kubenswrapper[4771]: I0227 01:11:06.463498 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 01:11:06 crc kubenswrapper[4771]: I0227 01:11:06.464074 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c4029c63755600e25b181f123ca63023911715b2cffea80198d5c361bd309b14"} Feb 27 01:11:06 crc kubenswrapper[4771]: I0227 01:11:06.464133 4771 scope.go:117] "RemoveContainer" containerID="74caa8275b6988734fcbb20adf8623ada0f1229f4ad6cd5bad17780278635abf" Feb 27 01:11:06 crc kubenswrapper[4771]: I0227 01:11:06.464085 4771 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c4029c63755600e25b181f123ca63023911715b2cffea80198d5c361bd309b14" exitCode=137 Feb 27 01:11:07 crc kubenswrapper[4771]: I0227 01:11:07.039367 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 01:11:07 crc kubenswrapper[4771]: I0227 01:11:07.264803 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 01:11:07 crc kubenswrapper[4771]: I0227 01:11:07.472745 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 27 01:11:07 crc kubenswrapper[4771]: I0227 01:11:07.474624 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 01:11:07 crc kubenswrapper[4771]: I0227 01:11:07.474726 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4d3caafddabce374e368c1303c73a363c71048c09aabc00bcbc6045115f014f4"} Feb 27 01:11:08 crc kubenswrapper[4771]: I0227 01:11:08.210781 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 27 01:11:08 crc kubenswrapper[4771]: I0227 01:11:08.574509 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 01:11:08 crc kubenswrapper[4771]: I0227 01:11:08.776064 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 27 01:11:10 crc kubenswrapper[4771]: I0227 01:11:10.901830 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 27 01:11:11 crc kubenswrapper[4771]: I0227 01:11:11.676707 4771 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 01:11:11 crc kubenswrapper[4771]: I0227 01:11:11.855065 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 01:11:14 crc kubenswrapper[4771]: I0227 01:11:14.702255 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:11:16 crc kubenswrapper[4771]: I0227 01:11:16.281223 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:11:16 crc kubenswrapper[4771]: I0227 01:11:16.286280 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:11:16 crc kubenswrapper[4771]: I0227 01:11:16.683683 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 27 01:11:16 crc kubenswrapper[4771]: I0227 01:11:16.772469 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:11:16 crc kubenswrapper[4771]: I0227 01:11:16.773314 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:11:18 crc kubenswrapper[4771]: W0227 01:11:18.565643 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b9e6f67_1c39_4d7e_8dc9_31b8265c5d8d.slice/crio-0d28c6aa1f2421b93ed1be335fad4e71e71cdb6a27eb5506531faba0511e42fe WatchSource:0}: Error finding container 0d28c6aa1f2421b93ed1be335fad4e71e71cdb6a27eb5506531faba0511e42fe: Status 404 returned error can't find the container with id 0d28c6aa1f2421b93ed1be335fad4e71e71cdb6a27eb5506531faba0511e42fe Feb 27 01:11:18 crc kubenswrapper[4771]: I0227 01:11:18.566608 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd"] Feb 27 01:11:18 crc kubenswrapper[4771]: I0227 01:11:18.772840 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:18 crc kubenswrapper[4771]: I0227 01:11:18.773604 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:19 crc kubenswrapper[4771]: I0227 01:11:19.198193 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-59vm6"] Feb 27 01:11:19 crc kubenswrapper[4771]: W0227 01:11:19.208500 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13c83ff4_13dd_4091_a05b_1f9b624fa886.slice/crio-0219c6632242802f070f641c61daa6abe15cf07088f19badc4876957d0e6024c WatchSource:0}: Error finding container 0219c6632242802f070f641c61daa6abe15cf07088f19badc4876957d0e6024c: Status 404 returned error can't find the container with id 0219c6632242802f070f641c61daa6abe15cf07088f19badc4876957d0e6024c Feb 27 01:11:19 crc kubenswrapper[4771]: I0227 01:11:19.546037 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" event={"ID":"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d","Type":"ContainerStarted","Data":"ce3fc7506144faad0e905ace17ac0da15f0e360a43243a263cee902578b0461e"} Feb 27 01:11:19 crc kubenswrapper[4771]: I0227 01:11:19.546087 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" event={"ID":"0b9e6f67-1c39-4d7e-8dc9-31b8265c5d8d","Type":"ContainerStarted","Data":"0d28c6aa1f2421b93ed1be335fad4e71e71cdb6a27eb5506531faba0511e42fe"} Feb 27 01:11:19 crc kubenswrapper[4771]: I0227 01:11:19.546335 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:11:19 crc kubenswrapper[4771]: I0227 01:11:19.548730 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-59vm6" event={"ID":"13c83ff4-13dd-4091-a05b-1f9b624fa886","Type":"ContainerStarted","Data":"0219c6632242802f070f641c61daa6abe15cf07088f19badc4876957d0e6024c"} Feb 27 01:11:19 crc kubenswrapper[4771]: I0227 01:11:19.561318 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" Feb 27 01:11:19 crc kubenswrapper[4771]: I0227 01:11:19.580626 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7f8484fbcc-qxqnd" podStartSLOduration=97.580608813 podStartE2EDuration="1m37.580608813s" podCreationTimestamp="2026-02-27 01:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:11:19.577329752 +0000 UTC m=+392.514891040" watchObservedRunningTime="2026-02-27 01:11:19.580608813 +0000 UTC m=+392.518170101" Feb 27 01:11:20 crc kubenswrapper[4771]: I0227 01:11:20.556462 4771 generic.go:334] "Generic (PLEG): container finished" podID="13c83ff4-13dd-4091-a05b-1f9b624fa886" containerID="60f3277e71f994b220e974e345c987b63441737b0cbfeb43596e96b208c99291" exitCode=0 Feb 27 01:11:20 crc kubenswrapper[4771]: I0227 01:11:20.556701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-59vm6" event={"ID":"13c83ff4-13dd-4091-a05b-1f9b624fa886","Type":"ContainerDied","Data":"60f3277e71f994b220e974e345c987b63441737b0cbfeb43596e96b208c99291"} Feb 27 01:11:20 crc kubenswrapper[4771]: I0227 01:11:20.598746 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 01:11:21 crc kubenswrapper[4771]: I0227 01:11:21.935821 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:22 crc kubenswrapper[4771]: I0227 01:11:22.032643 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r282c\" (UniqueName: \"kubernetes.io/projected/13c83ff4-13dd-4091-a05b-1f9b624fa886-kube-api-access-r282c\") pod \"13c83ff4-13dd-4091-a05b-1f9b624fa886\" (UID: \"13c83ff4-13dd-4091-a05b-1f9b624fa886\") " Feb 27 01:11:22 crc kubenswrapper[4771]: I0227 01:11:22.039167 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c83ff4-13dd-4091-a05b-1f9b624fa886-kube-api-access-r282c" (OuterVolumeSpecName: "kube-api-access-r282c") pod "13c83ff4-13dd-4091-a05b-1f9b624fa886" (UID: "13c83ff4-13dd-4091-a05b-1f9b624fa886"). InnerVolumeSpecName "kube-api-access-r282c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:11:22 crc kubenswrapper[4771]: I0227 01:11:22.134209 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r282c\" (UniqueName: \"kubernetes.io/projected/13c83ff4-13dd-4091-a05b-1f9b624fa886-kube-api-access-r282c\") on node \"crc\" DevicePath \"\"" Feb 27 01:11:22 crc kubenswrapper[4771]: I0227 01:11:22.578131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535910-59vm6" event={"ID":"13c83ff4-13dd-4091-a05b-1f9b624fa886","Type":"ContainerDied","Data":"0219c6632242802f070f641c61daa6abe15cf07088f19badc4876957d0e6024c"} Feb 27 01:11:22 crc kubenswrapper[4771]: I0227 01:11:22.578172 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0219c6632242802f070f641c61daa6abe15cf07088f19badc4876957d0e6024c" Feb 27 01:11:22 crc kubenswrapper[4771]: I0227 01:11:22.578228 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535910-59vm6" Feb 27 01:11:24 crc kubenswrapper[4771]: I0227 01:11:24.707044 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.135849 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535912-6mr69"] Feb 27 01:12:00 crc kubenswrapper[4771]: E0227 01:12:00.136609 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c83ff4-13dd-4091-a05b-1f9b624fa886" containerName="oc" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.136624 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c83ff4-13dd-4091-a05b-1f9b624fa886" containerName="oc" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.136723 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c83ff4-13dd-4091-a05b-1f9b624fa886" containerName="oc" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.137064 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-6mr69" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.139844 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.139983 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.143939 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-6mr69"] Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.145094 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.216312 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d6sx\" (UniqueName: \"kubernetes.io/projected/bc662da1-e2be-4b52-ae55-a223b4ffb8ad-kube-api-access-6d6sx\") pod \"auto-csr-approver-29535912-6mr69\" (UID: \"bc662da1-e2be-4b52-ae55-a223b4ffb8ad\") " pod="openshift-infra/auto-csr-approver-29535912-6mr69" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.317373 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d6sx\" (UniqueName: \"kubernetes.io/projected/bc662da1-e2be-4b52-ae55-a223b4ffb8ad-kube-api-access-6d6sx\") pod \"auto-csr-approver-29535912-6mr69\" (UID: \"bc662da1-e2be-4b52-ae55-a223b4ffb8ad\") " pod="openshift-infra/auto-csr-approver-29535912-6mr69" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.341631 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d6sx\" (UniqueName: \"kubernetes.io/projected/bc662da1-e2be-4b52-ae55-a223b4ffb8ad-kube-api-access-6d6sx\") pod \"auto-csr-approver-29535912-6mr69\" (UID: \"bc662da1-e2be-4b52-ae55-a223b4ffb8ad\") " pod="openshift-infra/auto-csr-approver-29535912-6mr69" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.464053 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-6mr69" Feb 27 01:12:00 crc kubenswrapper[4771]: I0227 01:12:00.894785 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-6mr69"] Feb 27 01:12:01 crc kubenswrapper[4771]: I0227 01:12:01.802381 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-6mr69" event={"ID":"bc662da1-e2be-4b52-ae55-a223b4ffb8ad","Type":"ContainerStarted","Data":"0ad17693f40c53252bea2abdac3d744d36add85caebcc0139d5c79f2a42b8ff2"} Feb 27 01:12:02 crc kubenswrapper[4771]: I0227 01:12:02.812222 4771 generic.go:334] "Generic (PLEG): container finished" podID="bc662da1-e2be-4b52-ae55-a223b4ffb8ad" containerID="bb2a8e20c9d55117ec7502085f7f2fb218308d20f7d81a64bb218b6d74ba0a3e" exitCode=0 Feb 27 01:12:02 crc kubenswrapper[4771]: I0227 01:12:02.812275 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-6mr69" event={"ID":"bc662da1-e2be-4b52-ae55-a223b4ffb8ad","Type":"ContainerDied","Data":"bb2a8e20c9d55117ec7502085f7f2fb218308d20f7d81a64bb218b6d74ba0a3e"} Feb 27 01:12:04 crc kubenswrapper[4771]: I0227 01:12:04.068496 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-6mr69" Feb 27 01:12:04 crc kubenswrapper[4771]: I0227 01:12:04.167005 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d6sx\" (UniqueName: \"kubernetes.io/projected/bc662da1-e2be-4b52-ae55-a223b4ffb8ad-kube-api-access-6d6sx\") pod \"bc662da1-e2be-4b52-ae55-a223b4ffb8ad\" (UID: \"bc662da1-e2be-4b52-ae55-a223b4ffb8ad\") " Feb 27 01:12:04 crc kubenswrapper[4771]: I0227 01:12:04.175011 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc662da1-e2be-4b52-ae55-a223b4ffb8ad-kube-api-access-6d6sx" (OuterVolumeSpecName: "kube-api-access-6d6sx") pod "bc662da1-e2be-4b52-ae55-a223b4ffb8ad" (UID: "bc662da1-e2be-4b52-ae55-a223b4ffb8ad"). InnerVolumeSpecName "kube-api-access-6d6sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:12:04 crc kubenswrapper[4771]: I0227 01:12:04.269146 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d6sx\" (UniqueName: \"kubernetes.io/projected/bc662da1-e2be-4b52-ae55-a223b4ffb8ad-kube-api-access-6d6sx\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:04 crc kubenswrapper[4771]: I0227 01:12:04.825319 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535912-6mr69" event={"ID":"bc662da1-e2be-4b52-ae55-a223b4ffb8ad","Type":"ContainerDied","Data":"0ad17693f40c53252bea2abdac3d744d36add85caebcc0139d5c79f2a42b8ff2"} Feb 27 01:12:04 crc kubenswrapper[4771]: I0227 01:12:04.825706 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad17693f40c53252bea2abdac3d744d36add85caebcc0139d5c79f2a42b8ff2" Feb 27 01:12:04 crc kubenswrapper[4771]: I0227 01:12:04.825409 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535912-6mr69" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.303260 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lds4k"] Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.304493 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lds4k" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" containerName="registry-server" containerID="cri-o://0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d" gracePeriod=30 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.308290 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xk58g"] Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.308609 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xk58g" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerName="registry-server" containerID="cri-o://3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487" gracePeriod=30 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.323621 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwr4f"] Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.324062 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" containerID="cri-o://29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb" gracePeriod=30 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.343038 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5l2f"] Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.343429 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d5l2f" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerName="registry-server" containerID="cri-o://8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d" gracePeriod=30 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.361810 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7r96b"] Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.362953 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7r96b" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="registry-server" containerID="cri-o://adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54" gracePeriod=30 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.378759 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jffnf"] Feb 27 01:12:17 crc kubenswrapper[4771]: E0227 01:12:17.392064 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc662da1-e2be-4b52-ae55-a223b4ffb8ad" containerName="oc" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.392129 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc662da1-e2be-4b52-ae55-a223b4ffb8ad" containerName="oc" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.392535 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc662da1-e2be-4b52-ae55-a223b4ffb8ad" 
containerName="oc" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.393321 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.396456 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jffnf"] Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.593014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cb60be5-a0ff-489e-a473-32a72359b2ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.593414 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8brcp\" (UniqueName: \"kubernetes.io/projected/9cb60be5-a0ff-489e-a473-32a72359b2ce-kube-api-access-8brcp\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.593435 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9cb60be5-a0ff-489e-a473-32a72359b2ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.694198 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cb60be5-a0ff-489e-a473-32a72359b2ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.694257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8brcp\" (UniqueName: \"kubernetes.io/projected/9cb60be5-a0ff-489e-a473-32a72359b2ce-kube-api-access-8brcp\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.694282 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9cb60be5-a0ff-489e-a473-32a72359b2ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.698219 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9cb60be5-a0ff-489e-a473-32a72359b2ce-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 
01:12:17.704995 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9cb60be5-a0ff-489e-a473-32a72359b2ce-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.711489 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8brcp\" (UniqueName: \"kubernetes.io/projected/9cb60be5-a0ff-489e-a473-32a72359b2ce-kube-api-access-8brcp\") pod \"marketplace-operator-79b997595-jffnf\" (UID: \"9cb60be5-a0ff-489e-a473-32a72359b2ce\") " pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.800241 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.806458 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lds4k" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.810712 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xk58g" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.822822 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.822919 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.825839 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.909338 4771 generic.go:334] "Generic (PLEG): container finished" podID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerID="adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54" exitCode=0 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.909427 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7r96b" event={"ID":"33d95d7b-dfe7-495a-b686-5737dd95b974","Type":"ContainerDied","Data":"adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.909467 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7r96b" event={"ID":"33d95d7b-dfe7-495a-b686-5737dd95b974","Type":"ContainerDied","Data":"11f43a18e055af50c64122c250383fecc60d3c679e110e9fec3a67d4e83cf787"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.909495 4771 scope.go:117] "RemoveContainer" containerID="adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.909686 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7r96b" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.925650 4771 generic.go:334] "Generic (PLEG): container finished" podID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerID="3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487" exitCode=0 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.925721 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk58g" event={"ID":"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3","Type":"ContainerDied","Data":"3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.925746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xk58g" event={"ID":"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3","Type":"ContainerDied","Data":"9da539d313ec5162f40beb8d4a77cf80b2b2cae6a68c46d3037d807108e4be4e"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.925798 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xk58g" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.928348 4771 generic.go:334] "Generic (PLEG): container finished" podID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerID="29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb" exitCode=0 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.928500 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.928821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" event={"ID":"963fd070-b5e6-4a67-afd6-d056aacf8bc2","Type":"ContainerDied","Data":"29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.928861 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rwr4f" event={"ID":"963fd070-b5e6-4a67-afd6-d056aacf8bc2","Type":"ContainerDied","Data":"b2ed07c40ddaed774caf8731e5dc006d7f2bfbc7d2cef9b338d51ad77e145ba4"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.931892 4771 generic.go:334] "Generic (PLEG): container finished" podID="0b5757cb-321d-4a76-8769-786b28a2b004" containerID="0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d" exitCode=0 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.931940 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lds4k" event={"ID":"0b5757cb-321d-4a76-8769-786b28a2b004","Type":"ContainerDied","Data":"0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.931960 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lds4k" event={"ID":"0b5757cb-321d-4a76-8769-786b28a2b004","Type":"ContainerDied","Data":"e26475c1fb191aa6a0b6893b69ce3258ae015ed411a2f2645477e14bf0c9a2d8"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.932023 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lds4k" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.934493 4771 generic.go:334] "Generic (PLEG): container finished" podID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerID="8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d" exitCode=0 Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.934534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5l2f" event={"ID":"deb9a4a5-1474-4744-a57e-fcdcc97922ed","Type":"ContainerDied","Data":"8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.934575 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5l2f" event={"ID":"deb9a4a5-1474-4744-a57e-fcdcc97922ed","Type":"ContainerDied","Data":"970024a3e6803c0ecfbcd86bc8def90fe4b988ea58c19e2017830b90d54e9e46"} Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.934639 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5l2f" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.947288 4771 scope.go:117] "RemoveContainer" containerID="08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.992722 4771 scope.go:117] "RemoveContainer" containerID="95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5" Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998767 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics\") pod \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998800 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2vfs\" (UniqueName: \"kubernetes.io/projected/deb9a4a5-1474-4744-a57e-fcdcc97922ed-kube-api-access-p2vfs\") pod \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998829 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-utilities\") pod \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998846 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-utilities\") pod \"33d95d7b-dfe7-495a-b686-5737dd95b974\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998865 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-catalog-content\") pod \"0b5757cb-321d-4a76-8769-786b28a2b004\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998882 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg6c9\" (UniqueName: 
\"kubernetes.io/projected/33d95d7b-dfe7-495a-b686-5737dd95b974-kube-api-access-wg6c9\") pod \"33d95d7b-dfe7-495a-b686-5737dd95b974\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998902 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr2k6\" (UniqueName: \"kubernetes.io/projected/963fd070-b5e6-4a67-afd6-d056aacf8bc2-kube-api-access-sr2k6\") pod \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998922 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-catalog-content\") pod \"33d95d7b-dfe7-495a-b686-5737dd95b974\" (UID: \"33d95d7b-dfe7-495a-b686-5737dd95b974\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998939 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-utilities\") pod \"0b5757cb-321d-4a76-8769-786b28a2b004\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998962 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sstcr\" (UniqueName: \"kubernetes.io/projected/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-kube-api-access-sstcr\") pod \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.998979 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzlpt\" (UniqueName: \"kubernetes.io/projected/0b5757cb-321d-4a76-8769-786b28a2b004-kube-api-access-kzlpt\") pod \"0b5757cb-321d-4a76-8769-786b28a2b004\" (UID: \"0b5757cb-321d-4a76-8769-786b28a2b004\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.999004 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-catalog-content\") pod \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\" (UID: \"deb9a4a5-1474-4744-a57e-fcdcc97922ed\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.999022 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca\") pod \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\" (UID: \"963fd070-b5e6-4a67-afd6-d056aacf8bc2\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.999037 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-catalog-content\") pod \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " Feb 27 01:12:17 crc kubenswrapper[4771]: I0227 01:12:17.999052 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-utilities\") pod \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\" (UID: \"460ffbff-d0f0-43dc-bde9-6279c9a4b6a3\") " Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:17.999988 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-utilities" (OuterVolumeSpecName: "utilities") pod "deb9a4a5-1474-4744-a57e-fcdcc97922ed" (UID: "deb9a4a5-1474-4744-a57e-fcdcc97922ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:17.999998 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "963fd070-b5e6-4a67-afd6-d056aacf8bc2" (UID: "963fd070-b5e6-4a67-afd6-d056aacf8bc2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.000047 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-utilities" (OuterVolumeSpecName: "utilities") pod "460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" (UID: "460ffbff-d0f0-43dc-bde9-6279c9a4b6a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.000409 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-utilities" (OuterVolumeSpecName: "utilities") pod "33d95d7b-dfe7-495a-b686-5737dd95b974" (UID: "33d95d7b-dfe7-495a-b686-5737dd95b974"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.004085 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5757cb-321d-4a76-8769-786b28a2b004-kube-api-access-kzlpt" (OuterVolumeSpecName: "kube-api-access-kzlpt") pod "0b5757cb-321d-4a76-8769-786b28a2b004" (UID: "0b5757cb-321d-4a76-8769-786b28a2b004"). InnerVolumeSpecName "kube-api-access-kzlpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.004584 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "963fd070-b5e6-4a67-afd6-d056aacf8bc2" (UID: "963fd070-b5e6-4a67-afd6-d056aacf8bc2"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.004837 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-utilities" (OuterVolumeSpecName: "utilities") pod "0b5757cb-321d-4a76-8769-786b28a2b004" (UID: "0b5757cb-321d-4a76-8769-786b28a2b004"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.005891 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-kube-api-access-sstcr" (OuterVolumeSpecName: "kube-api-access-sstcr") pod "460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" (UID: "460ffbff-d0f0-43dc-bde9-6279c9a4b6a3"). InnerVolumeSpecName "kube-api-access-sstcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.007906 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33d95d7b-dfe7-495a-b686-5737dd95b974-kube-api-access-wg6c9" (OuterVolumeSpecName: "kube-api-access-wg6c9") pod "33d95d7b-dfe7-495a-b686-5737dd95b974" (UID: "33d95d7b-dfe7-495a-b686-5737dd95b974"). InnerVolumeSpecName "kube-api-access-wg6c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.007921 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb9a4a5-1474-4744-a57e-fcdcc97922ed-kube-api-access-p2vfs" (OuterVolumeSpecName: "kube-api-access-p2vfs") pod "deb9a4a5-1474-4744-a57e-fcdcc97922ed" (UID: "deb9a4a5-1474-4744-a57e-fcdcc97922ed"). InnerVolumeSpecName "kube-api-access-p2vfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.014523 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963fd070-b5e6-4a67-afd6-d056aacf8bc2-kube-api-access-sr2k6" (OuterVolumeSpecName: "kube-api-access-sr2k6") pod "963fd070-b5e6-4a67-afd6-d056aacf8bc2" (UID: "963fd070-b5e6-4a67-afd6-d056aacf8bc2"). InnerVolumeSpecName "kube-api-access-sr2k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.014601 4771 scope.go:117] "RemoveContainer" containerID="adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.016990 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54\": container with ID starting with adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54 not found: ID does not exist" containerID="adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.017033 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54"} err="failed to get container status \"adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54\": rpc error: code = NotFound desc = could not find container \"adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54\": container with ID starting with adc583580a2e6555467330ce2ef6830ba65f8c21a4c2ca71e73d712e5c113f54 not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.017056 4771 scope.go:117] "RemoveContainer" containerID="08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.017592 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23\": container with ID starting with 08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23 not found: ID does not exist" containerID="08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.017622 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23"} err="failed to get container status \"08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23\": rpc error: code = NotFound desc = could not find container \"08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23\": container with ID starting with 08a8bfddb81f5f2af6b82fbf0bc4cdd2d42bdb3a31f76a788d084702fcd3fe23 not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.017640 4771 scope.go:117] "RemoveContainer" containerID="95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.018820 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5\": container with ID starting with 95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5 not found: ID does not exist" containerID="95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.018862 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5"} err="failed to get container status \"95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5\": rpc error: code = NotFound desc = could not find container \"95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5\": container with ID starting with 95b828f5b4197c1a596ee79949d2f6af0cbd56911f39aa62ee2742a50e7e15d5 not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.018887 4771 scope.go:117] "RemoveContainer" containerID="3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.022661 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deb9a4a5-1474-4744-a57e-fcdcc97922ed" (UID: "deb9a4a5-1474-4744-a57e-fcdcc97922ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.035041 4771 scope.go:117] "RemoveContainer" containerID="95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.053733 4771 scope.go:117] "RemoveContainer" containerID="fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.060335 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" (UID: "460ffbff-d0f0-43dc-bde9-6279c9a4b6a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.074061 4771 scope.go:117] "RemoveContainer" containerID="3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.074354 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487\": container with ID starting with 3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487 not found: ID does not exist" containerID="3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.074392 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487"} err="failed to get container status \"3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487\": rpc error: code = NotFound desc = could not find container \"3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487\": container with ID starting with 3c71395cd4bd71530a327203b7f598b299539b916e93ce5e7465b8dcbe031487 not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.074421 4771 scope.go:117] "RemoveContainer" containerID="95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.074812 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992\": container with ID starting with 95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992 not found: ID does not exist" containerID="95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.074856 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992"} err="failed to get container status \"95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992\": rpc error: code = NotFound desc = could not find container \"95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992\": container with ID starting with 95ecb081700720bf87b845e9a62ec591d900173d79072552c5e69dba5f110992 not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.074888 4771 scope.go:117] "RemoveContainer" containerID="fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.075346 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e\": container with ID starting with fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e not found: ID does not exist" containerID="fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.075376 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e"} err="failed to get container status \"fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e\": rpc error: code = NotFound desc = could not 
find container \"fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e\": container with ID starting with fc9e4c09e8487da11ce052951b57a20b9214f4fd6fa12c2c4cca62daaab2317e not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.075396 4771 scope.go:117] "RemoveContainer" containerID="29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.088133 4771 scope.go:117] "RemoveContainer" containerID="a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.091980 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b5757cb-321d-4a76-8769-786b28a2b004" (UID: "0b5757cb-321d-4a76-8769-786b28a2b004"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.099915 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.099973 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.099989 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100002 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100011 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/963fd070-b5e6-4a67-afd6-d056aacf8bc2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100022 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2vfs\" (UniqueName: \"kubernetes.io/projected/deb9a4a5-1474-4744-a57e-fcdcc97922ed-kube-api-access-p2vfs\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100031 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb9a4a5-1474-4744-a57e-fcdcc97922ed-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100042 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100052 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100061 4771 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-wg6c9\" (UniqueName: \"kubernetes.io/projected/33d95d7b-dfe7-495a-b686-5737dd95b974-kube-api-access-wg6c9\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100070 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr2k6\" (UniqueName: \"kubernetes.io/projected/963fd070-b5e6-4a67-afd6-d056aacf8bc2-kube-api-access-sr2k6\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100079 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5757cb-321d-4a76-8769-786b28a2b004-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100087 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sstcr\" (UniqueName: \"kubernetes.io/projected/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3-kube-api-access-sstcr\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.100096 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzlpt\" (UniqueName: \"kubernetes.io/projected/0b5757cb-321d-4a76-8769-786b28a2b004-kube-api-access-kzlpt\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.101620 4771 scope.go:117] "RemoveContainer" containerID="29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.102047 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb\": container with ID starting with 29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb not found: ID does not exist" containerID="29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.102082 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb"} err="failed to get container status \"29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb\": rpc error: code = NotFound desc = could not find container \"29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb\": container with ID starting with 29b73e867d24e4e4a2d589b54eb62e16fc0012f33e537fb487b81c830b48a6cb not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.102107 4771 scope.go:117] "RemoveContainer" containerID="a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.102336 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de\": container with ID starting with a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de not found: ID does not exist" containerID="a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.102355 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de"} err="failed to get container status \"a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de\": rpc error: code = NotFound desc = could not 
find container \"a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de\": container with ID starting with a8e6d733021697d0fa055f19b568a4e1868ca7248f61c98a447627db114115de not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.102367 4771 scope.go:117] "RemoveContainer" containerID="0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.115605 4771 scope.go:117] "RemoveContainer" containerID="d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.126664 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33d95d7b-dfe7-495a-b686-5737dd95b974" (UID: "33d95d7b-dfe7-495a-b686-5737dd95b974"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.128139 4771 scope.go:117] "RemoveContainer" containerID="a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.142682 4771 scope.go:117] "RemoveContainer" containerID="0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.143034 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d\": container with ID starting with 0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d not found: ID does not exist" containerID="0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.143065 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d"} err="failed to get container status \"0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d\": rpc error: code = NotFound desc = could not find container \"0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d\": container with ID starting with 0f0d42e63af5bd77c784ef421d2d59e606ac8de2e02658b3fe2f1a8785594c6d not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.143085 4771 scope.go:117] "RemoveContainer" containerID="d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.143348 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3\": container with ID starting with d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3 not found: ID does not exist" containerID="d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.143383 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3"} err="failed to get container status \"d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3\": rpc error: code = NotFound desc = could not find container \"d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3\": container 
with ID starting with d347dd6ed5ac7651f21d473a9b564eda8422fc90e4d337eaedbe16edc926f6d3 not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.143405 4771 scope.go:117] "RemoveContainer" containerID="a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.143649 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b\": container with ID starting with a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b not found: ID does not exist" containerID="a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.143669 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b"} err="failed to get container status \"a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b\": rpc error: code = NotFound desc = could not find container \"a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b\": container with ID starting with a4e44e2ff53b0b41f0747a2c135f8582388f21d0d5972688f6fb0248eeff4c4b not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.143682 4771 scope.go:117] "RemoveContainer" containerID="8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.158489 4771 scope.go:117] "RemoveContainer" containerID="53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.172530 4771 scope.go:117] "RemoveContainer" containerID="682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.185193 4771 scope.go:117] "RemoveContainer" containerID="8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.185573 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d\": container with ID starting with 8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d not found: ID does not exist" containerID="8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.185632 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d"} err="failed to get container status \"8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d\": rpc error: code = NotFound desc = could not find container \"8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d\": container with ID starting with 8f8bd6441fd3159b7d3a79d09507058123c78ad7606f4d6a21f28afba06b5e6d not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.185668 4771 scope.go:117] "RemoveContainer" containerID="53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.186111 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3\": container with ID starting with 53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3 not found: ID does not exist" containerID="53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.186139 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3"} err="failed to get container status \"53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3\": rpc error: code = NotFound desc = could not find container \"53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3\": container with ID starting with 53c069778ad436ecffc8db28eaf5a029961d85a3bc94849e7778bda0d571d9b3 not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.186154 4771 scope.go:117] "RemoveContainer" containerID="682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71" Feb 27 01:12:18 crc kubenswrapper[4771]: E0227 01:12:18.186428 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71\": container with ID starting with 682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71 not found: ID does not exist" containerID="682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.186447 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71"} err="failed to get container status \"682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71\": rpc error: code = NotFound desc = could not find container \"682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71\": container with ID starting with 682c22dcd6c3459e2fbcb0d6c685cf1d905abab945401dedf2d0a7549762fe71 not found: ID does not exist" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.201171 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33d95d7b-dfe7-495a-b686-5737dd95b974-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.237107 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7r96b"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.242862 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7r96b"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.277264 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xk58g"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.280212 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xk58g"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.286499 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jffnf"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.288801 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rwr4f"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.296974 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-rwr4f"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.300323 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5l2f"] Feb 27 01:12:18 crc kubenswrapper[4771]: W0227 01:12:18.300844 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb60be5_a0ff_489e_a473_32a72359b2ce.slice/crio-1cfd33852ed63d5be5d3e37abc3096dadc10e2de3bd25b0af1f4fe1e56a6372f WatchSource:0}: Error finding container 1cfd33852ed63d5be5d3e37abc3096dadc10e2de3bd25b0af1f4fe1e56a6372f: Status 404 returned error can't find the container with id 1cfd33852ed63d5be5d3e37abc3096dadc10e2de3bd25b0af1f4fe1e56a6372f Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.306498 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5l2f"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.312542 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lds4k"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.316431 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lds4k"] Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.942052 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" event={"ID":"9cb60be5-a0ff-489e-a473-32a72359b2ce","Type":"ContainerStarted","Data":"a23e1ebf791d091a77981c633ee60313b1b32a93318283febf6d837fb7a0c082"} Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.942375 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" event={"ID":"9cb60be5-a0ff-489e-a473-32a72359b2ce","Type":"ContainerStarted","Data":"1cfd33852ed63d5be5d3e37abc3096dadc10e2de3bd25b0af1f4fe1e56a6372f"} Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.942392 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.954773 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" Feb 27 01:12:18 crc kubenswrapper[4771]: I0227 01:12:18.969265 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jffnf" podStartSLOduration=1.969229322 podStartE2EDuration="1.969229322s" podCreationTimestamp="2026-02-27 01:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:12:18.963950152 +0000 UTC m=+451.901511450" watchObservedRunningTime="2026-02-27 01:12:18.969229322 +0000 UTC m=+451.906790650" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.530984 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sl98j"] Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531579 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531599 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="registry-server" Feb 27 01:12:19 crc 
kubenswrapper[4771]: E0227 01:12:19.531642 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" containerName="extract-utilities" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531655 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" containerName="extract-utilities" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531672 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerName="extract-utilities" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531684 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerName="extract-utilities" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531701 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531714 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531727 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="extract-content" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531740 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="extract-content" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531757 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531768 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531790 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerName="extract-content" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531803 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerName="extract-content" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531816 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" containerName="extract-content" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531828 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" containerName="extract-content" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531846 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="extract-utilities" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531859 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="extract-utilities" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531879 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531891 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" 
containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531904 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerName="extract-content" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531928 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerName="extract-content" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531940 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531952 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.531968 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerName="extract-utilities" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.531979 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerName="extract-utilities" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.532130 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.532150 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.532178 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.532201 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.532215 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" containerName="registry-server" Feb 27 01:12:19 crc kubenswrapper[4771]: E0227 01:12:19.532373 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.532386 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.532522 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" containerName="marketplace-operator" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.533382 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.535662 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.547896 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sl98j"] Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.724402 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a25ddae9-53d5-4d86-9914-10fc8e695cb3-catalog-content\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.724468 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v27v6\" (UniqueName: \"kubernetes.io/projected/a25ddae9-53d5-4d86-9914-10fc8e695cb3-kube-api-access-v27v6\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.724627 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a25ddae9-53d5-4d86-9914-10fc8e695cb3-utilities\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.735878 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nm89l"] Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.737469 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.739681 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.743523 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nm89l"] Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.784677 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5757cb-321d-4a76-8769-786b28a2b004" path="/var/lib/kubelet/pods/0b5757cb-321d-4a76-8769-786b28a2b004/volumes" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.785590 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33d95d7b-dfe7-495a-b686-5737dd95b974" path="/var/lib/kubelet/pods/33d95d7b-dfe7-495a-b686-5737dd95b974/volumes" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.786281 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460ffbff-d0f0-43dc-bde9-6279c9a4b6a3" path="/var/lib/kubelet/pods/460ffbff-d0f0-43dc-bde9-6279c9a4b6a3/volumes" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.787472 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963fd070-b5e6-4a67-afd6-d056aacf8bc2" path="/var/lib/kubelet/pods/963fd070-b5e6-4a67-afd6-d056aacf8bc2/volumes" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.788005 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb9a4a5-1474-4744-a57e-fcdcc97922ed" path="/var/lib/kubelet/pods/deb9a4a5-1474-4744-a57e-fcdcc97922ed/volumes" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.825972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb1594d2-dbd5-4e37-8d97-dac2a6357808-utilities\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.826027 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb1594d2-dbd5-4e37-8d97-dac2a6357808-catalog-content\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.826163 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-666ks\" (UniqueName: \"kubernetes.io/projected/eb1594d2-dbd5-4e37-8d97-dac2a6357808-kube-api-access-666ks\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.826306 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a25ddae9-53d5-4d86-9914-10fc8e695cb3-catalog-content\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.826412 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v27v6\" (UniqueName: 
\"kubernetes.io/projected/a25ddae9-53d5-4d86-9914-10fc8e695cb3-kube-api-access-v27v6\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.826461 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a25ddae9-53d5-4d86-9914-10fc8e695cb3-utilities\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.826843 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a25ddae9-53d5-4d86-9914-10fc8e695cb3-catalog-content\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.826956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a25ddae9-53d5-4d86-9914-10fc8e695cb3-utilities\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.846119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v27v6\" (UniqueName: \"kubernetes.io/projected/a25ddae9-53d5-4d86-9914-10fc8e695cb3-kube-api-access-v27v6\") pod \"redhat-marketplace-sl98j\" (UID: \"a25ddae9-53d5-4d86-9914-10fc8e695cb3\") " pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.859480 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.926962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-666ks\" (UniqueName: \"kubernetes.io/projected/eb1594d2-dbd5-4e37-8d97-dac2a6357808-kube-api-access-666ks\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.927169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb1594d2-dbd5-4e37-8d97-dac2a6357808-utilities\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.927205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb1594d2-dbd5-4e37-8d97-dac2a6357808-catalog-content\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.927996 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb1594d2-dbd5-4e37-8d97-dac2a6357808-catalog-content\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.930932 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb1594d2-dbd5-4e37-8d97-dac2a6357808-utilities\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:19 crc kubenswrapper[4771]: I0227 01:12:19.953150 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-666ks\" (UniqueName: \"kubernetes.io/projected/eb1594d2-dbd5-4e37-8d97-dac2a6357808-kube-api-access-666ks\") pod \"redhat-operators-nm89l\" (UID: \"eb1594d2-dbd5-4e37-8d97-dac2a6357808\") " pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.057257 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sl98j"] Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.069330 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.702152 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nm89l"] Feb 27 01:12:20 crc kubenswrapper[4771]: W0227 01:12:20.709522 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb1594d2_dbd5_4e37_8d97_dac2a6357808.slice/crio-2ac8db6e2e180d9bbbfea7a7b0e03f4ef7dce60eac98887d4619265606560883 WatchSource:0}: Error finding container 2ac8db6e2e180d9bbbfea7a7b0e03f4ef7dce60eac98887d4619265606560883: Status 404 returned error can't find the container with id 2ac8db6e2e180d9bbbfea7a7b0e03f4ef7dce60eac98887d4619265606560883 Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.967286 4771 generic.go:334] "Generic (PLEG): container finished" podID="a25ddae9-53d5-4d86-9914-10fc8e695cb3" containerID="f24d0e819e384d161f362aa759c055d44b68476ee2219ba48b33fb3a855bd8a8" exitCode=0 Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.967334 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl98j" event={"ID":"a25ddae9-53d5-4d86-9914-10fc8e695cb3","Type":"ContainerDied","Data":"f24d0e819e384d161f362aa759c055d44b68476ee2219ba48b33fb3a855bd8a8"} Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.967676 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl98j" event={"ID":"a25ddae9-53d5-4d86-9914-10fc8e695cb3","Type":"ContainerStarted","Data":"2cf98952955a3435d77c64b558e1c7e75c2f4315a055d893956093425c4749c0"} Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.972694 4771 generic.go:334] "Generic (PLEG): container finished" podID="eb1594d2-dbd5-4e37-8d97-dac2a6357808" containerID="dd604110c0930debf439c191e3d3fd9f3865f33dd1014a78c27e4e1356e91f6b" exitCode=0 Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.972756 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nm89l" event={"ID":"eb1594d2-dbd5-4e37-8d97-dac2a6357808","Type":"ContainerDied","Data":"dd604110c0930debf439c191e3d3fd9f3865f33dd1014a78c27e4e1356e91f6b"} Feb 27 01:12:20 crc kubenswrapper[4771]: I0227 01:12:20.972787 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nm89l" event={"ID":"eb1594d2-dbd5-4e37-8d97-dac2a6357808","Type":"ContainerStarted","Data":"2ac8db6e2e180d9bbbfea7a7b0e03f4ef7dce60eac98887d4619265606560883"} Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.920519 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n48ck"] Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.921971 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.923768 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.940423 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n48ck"] Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.956333 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/005f8696-cd2c-46e3-8edb-d9c9c9652871-catalog-content\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.956381 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/005f8696-cd2c-46e3-8edb-d9c9c9652871-utilities\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.956404 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xmc\" (UniqueName: \"kubernetes.io/projected/005f8696-cd2c-46e3-8edb-d9c9c9652871-kube-api-access-79xmc\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.978325 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nm89l" event={"ID":"eb1594d2-dbd5-4e37-8d97-dac2a6357808","Type":"ContainerStarted","Data":"ae184e8446426f0441311ed7c612da96aad2cbc003fd3108bb071cc4a74ad851"} Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.979944 4771 generic.go:334] "Generic (PLEG): container finished" podID="a25ddae9-53d5-4d86-9914-10fc8e695cb3" containerID="6174465cd90f7fc06f3709feb0708266636f7a6a33840c8bde110da48417ee8a" exitCode=0 Feb 27 01:12:21 crc kubenswrapper[4771]: I0227 01:12:21.979988 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl98j" event={"ID":"a25ddae9-53d5-4d86-9914-10fc8e695cb3","Type":"ContainerDied","Data":"6174465cd90f7fc06f3709feb0708266636f7a6a33840c8bde110da48417ee8a"} Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.057115 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/005f8696-cd2c-46e3-8edb-d9c9c9652871-utilities\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.057157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79xmc\" (UniqueName: \"kubernetes.io/projected/005f8696-cd2c-46e3-8edb-d9c9c9652871-kube-api-access-79xmc\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.057242 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/005f8696-cd2c-46e3-8edb-d9c9c9652871-catalog-content\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.057594 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/005f8696-cd2c-46e3-8edb-d9c9c9652871-catalog-content\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.057648 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/005f8696-cd2c-46e3-8edb-d9c9c9652871-utilities\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.075022 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xmc\" (UniqueName: \"kubernetes.io/projected/005f8696-cd2c-46e3-8edb-d9c9c9652871-kube-api-access-79xmc\") pod \"community-operators-n48ck\" (UID: \"005f8696-cd2c-46e3-8edb-d9c9c9652871\") " pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.117497 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6b7jf"] Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.118612 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.120720 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.131648 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6b7jf"] Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.158038 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ef043a-8831-43e4-abb7-583a36418b6c-catalog-content\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.158396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ef043a-8831-43e4-abb7-583a36418b6c-utilities\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.158526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hv46\" (UniqueName: \"kubernetes.io/projected/54ef043a-8831-43e4-abb7-583a36418b6c-kube-api-access-8hv46\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.238209 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.259228 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ef043a-8831-43e4-abb7-583a36418b6c-catalog-content\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.259344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ef043a-8831-43e4-abb7-583a36418b6c-utilities\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.259397 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hv46\" (UniqueName: \"kubernetes.io/projected/54ef043a-8831-43e4-abb7-583a36418b6c-kube-api-access-8hv46\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.259739 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ef043a-8831-43e4-abb7-583a36418b6c-catalog-content\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.259805 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ef043a-8831-43e4-abb7-583a36418b6c-utilities\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.286531 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hv46\" (UniqueName: \"kubernetes.io/projected/54ef043a-8831-43e4-abb7-583a36418b6c-kube-api-access-8hv46\") pod \"certified-operators-6b7jf\" (UID: \"54ef043a-8831-43e4-abb7-583a36418b6c\") " pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.421520 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n48ck"] Feb 27 01:12:22 crc kubenswrapper[4771]: W0227 01:12:22.428167 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod005f8696_cd2c_46e3_8edb_d9c9c9652871.slice/crio-2bcf34d583eba0659d3756877b2b2bbd4bc11d56e3e197b2dc1fa392bdec210c WatchSource:0}: Error finding container 2bcf34d583eba0659d3756877b2b2bbd4bc11d56e3e197b2dc1fa392bdec210c: Status 404 returned error can't find the container with id 2bcf34d583eba0659d3756877b2b2bbd4bc11d56e3e197b2dc1fa392bdec210c Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.437155 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.636415 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6b7jf"] Feb 27 01:12:22 crc kubenswrapper[4771]: W0227 01:12:22.639537 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ef043a_8831_43e4_abb7_583a36418b6c.slice/crio-5be0c4f8363c31e36dd931c28c6784d13449e51b970fe81b84e3857318fb4e79 WatchSource:0}: Error finding container 5be0c4f8363c31e36dd931c28c6784d13449e51b970fe81b84e3857318fb4e79: Status 404 returned error can't find the container with id 5be0c4f8363c31e36dd931c28c6784d13449e51b970fe81b84e3857318fb4e79 Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.985872 4771 generic.go:334] "Generic (PLEG): container finished" podID="005f8696-cd2c-46e3-8edb-d9c9c9652871" containerID="0af6f8f85bf32a068f9b6ceeb8e4be87ffbbd61480e0d0f25cc5d864f6f3ae5a" exitCode=0 Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.986078 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n48ck" event={"ID":"005f8696-cd2c-46e3-8edb-d9c9c9652871","Type":"ContainerDied","Data":"0af6f8f85bf32a068f9b6ceeb8e4be87ffbbd61480e0d0f25cc5d864f6f3ae5a"} Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.986310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n48ck" event={"ID":"005f8696-cd2c-46e3-8edb-d9c9c9652871","Type":"ContainerStarted","Data":"2bcf34d583eba0659d3756877b2b2bbd4bc11d56e3e197b2dc1fa392bdec210c"} Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.988350 4771 generic.go:334] "Generic (PLEG): container finished" podID="eb1594d2-dbd5-4e37-8d97-dac2a6357808" containerID="ae184e8446426f0441311ed7c612da96aad2cbc003fd3108bb071cc4a74ad851" exitCode=0 Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.988685 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nm89l" event={"ID":"eb1594d2-dbd5-4e37-8d97-dac2a6357808","Type":"ContainerDied","Data":"ae184e8446426f0441311ed7c612da96aad2cbc003fd3108bb071cc4a74ad851"} Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.991978 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sl98j" event={"ID":"a25ddae9-53d5-4d86-9914-10fc8e695cb3","Type":"ContainerStarted","Data":"890a7f6e68bfcc4c22b8468c13fb32356ccf1446e6e37081f7b50e82a4648933"} Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.995750 4771 generic.go:334] "Generic (PLEG): container finished" podID="54ef043a-8831-43e4-abb7-583a36418b6c" containerID="c1ebcd7a3e1f2f3b2616e4ce5437b4a4ff4cb02a0732972aa49ce659b91551fc" exitCode=0 Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.995798 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b7jf" event={"ID":"54ef043a-8831-43e4-abb7-583a36418b6c","Type":"ContainerDied","Data":"c1ebcd7a3e1f2f3b2616e4ce5437b4a4ff4cb02a0732972aa49ce659b91551fc"} Feb 27 01:12:22 crc kubenswrapper[4771]: I0227 01:12:22.995826 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b7jf" event={"ID":"54ef043a-8831-43e4-abb7-583a36418b6c","Type":"ContainerStarted","Data":"5be0c4f8363c31e36dd931c28c6784d13449e51b970fe81b84e3857318fb4e79"} Feb 27 01:12:23 crc kubenswrapper[4771]: I0227 01:12:23.029515 
4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sl98j" podStartSLOduration=2.564527403 podStartE2EDuration="4.029494393s" podCreationTimestamp="2026-02-27 01:12:19 +0000 UTC" firstStartedPulling="2026-02-27 01:12:20.970249009 +0000 UTC m=+453.907810307" lastFinishedPulling="2026-02-27 01:12:22.435216009 +0000 UTC m=+455.372777297" observedRunningTime="2026-02-27 01:12:23.026927141 +0000 UTC m=+455.964488479" watchObservedRunningTime="2026-02-27 01:12:23.029494393 +0000 UTC m=+455.967055691" Feb 27 01:12:24 crc kubenswrapper[4771]: I0227 01:12:24.002775 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nm89l" event={"ID":"eb1594d2-dbd5-4e37-8d97-dac2a6357808","Type":"ContainerStarted","Data":"ca3067fc67ac1aca3cdd1e0a57186044f3c42c30100db482faab36a19a310354"} Feb 27 01:12:24 crc kubenswrapper[4771]: I0227 01:12:24.004681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b7jf" event={"ID":"54ef043a-8831-43e4-abb7-583a36418b6c","Type":"ContainerStarted","Data":"ebd580628c7ce9b5a34326df520c08922ba57ac7b6cfd9fd595e7ae30b3140a0"} Feb 27 01:12:24 crc kubenswrapper[4771]: I0227 01:12:24.012823 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n48ck" event={"ID":"005f8696-cd2c-46e3-8edb-d9c9c9652871","Type":"ContainerStarted","Data":"fa3b1084a13b8cdc5dad8e5f7433270f93e055c55ebe4d505e1e571066381ce5"} Feb 27 01:12:24 crc kubenswrapper[4771]: I0227 01:12:24.021275 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nm89l" podStartSLOduration=2.5042878650000002 podStartE2EDuration="5.021261847s" podCreationTimestamp="2026-02-27 01:12:19 +0000 UTC" firstStartedPulling="2026-02-27 01:12:20.978649194 +0000 UTC m=+453.916210482" lastFinishedPulling="2026-02-27 01:12:23.495623166 +0000 UTC m=+456.433184464" observedRunningTime="2026-02-27 01:12:24.019875004 +0000 UTC m=+456.957436292" watchObservedRunningTime="2026-02-27 01:12:24.021261847 +0000 UTC m=+456.958823125" Feb 27 01:12:25 crc kubenswrapper[4771]: I0227 01:12:25.020762 4771 generic.go:334] "Generic (PLEG): container finished" podID="54ef043a-8831-43e4-abb7-583a36418b6c" containerID="ebd580628c7ce9b5a34326df520c08922ba57ac7b6cfd9fd595e7ae30b3140a0" exitCode=0 Feb 27 01:12:25 crc kubenswrapper[4771]: I0227 01:12:25.020831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b7jf" event={"ID":"54ef043a-8831-43e4-abb7-583a36418b6c","Type":"ContainerDied","Data":"ebd580628c7ce9b5a34326df520c08922ba57ac7b6cfd9fd595e7ae30b3140a0"} Feb 27 01:12:25 crc kubenswrapper[4771]: I0227 01:12:25.034663 4771 generic.go:334] "Generic (PLEG): container finished" podID="005f8696-cd2c-46e3-8edb-d9c9c9652871" containerID="fa3b1084a13b8cdc5dad8e5f7433270f93e055c55ebe4d505e1e571066381ce5" exitCode=0 Feb 27 01:12:25 crc kubenswrapper[4771]: I0227 01:12:25.034805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n48ck" event={"ID":"005f8696-cd2c-46e3-8edb-d9c9c9652871","Type":"ContainerDied","Data":"fa3b1084a13b8cdc5dad8e5f7433270f93e055c55ebe4d505e1e571066381ce5"} Feb 27 01:12:27 crc kubenswrapper[4771]: I0227 01:12:27.046985 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b7jf" 
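The "Observed pod startup duration" entries can be checked by hand: podStartSLOduration is the end-to-end startup time (observed running time minus creation time) minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A worked check with the redhat-marketplace-sl98j values above; the timestamps are copied from the log, and the last ~10ns differ because the kubelet subtracts monotonic m=+... readings rather than wall-clock times:

```go
// Worked check of the pod-startup SLO arithmetic, using values copied from
// the redhat-marketplace-sl98j log entry above. Sketch only.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-02-27 01:12:19 +0000 UTC")
	firstPull := mustParse("2026-02-27 01:12:20.970249009 +0000 UTC")
	lastPull := mustParse("2026-02-27 01:12:22.435216009 +0000 UTC")
	observed := mustParse("2026-02-27 01:12:23.029494393 +0000 UTC")

	e2e := observed.Sub(created)         // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // exclude time spent pulling images

	fmt.Println(e2e) // 4.029494393s, matching podStartE2EDuration
	fmt.Println(slo) // 2.564527393s; the logged 2.564527403 differs by ~10ns
	//                  because the kubelet uses monotonic clock readings
}
```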
event={"ID":"54ef043a-8831-43e4-abb7-583a36418b6c","Type":"ContainerStarted","Data":"349497d7e8d21b2a7b8740f72a5cf90baa242dd2c3d2eeab2f0bf8c76c839af4"} Feb 27 01:12:27 crc kubenswrapper[4771]: I0227 01:12:27.050138 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n48ck" event={"ID":"005f8696-cd2c-46e3-8edb-d9c9c9652871","Type":"ContainerStarted","Data":"d3ad898404af91bd75124ffaa0b9a096c96ad688b2ace055c74860a193fd194e"} Feb 27 01:12:27 crc kubenswrapper[4771]: I0227 01:12:27.069573 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6b7jf" podStartSLOduration=1.641789981 podStartE2EDuration="5.06953034s" podCreationTimestamp="2026-02-27 01:12:22 +0000 UTC" firstStartedPulling="2026-02-27 01:12:22.997019869 +0000 UTC m=+455.934581157" lastFinishedPulling="2026-02-27 01:12:26.424760198 +0000 UTC m=+459.362321516" observedRunningTime="2026-02-27 01:12:27.065571475 +0000 UTC m=+460.003132763" watchObservedRunningTime="2026-02-27 01:12:27.06953034 +0000 UTC m=+460.007091618" Feb 27 01:12:27 crc kubenswrapper[4771]: I0227 01:12:27.087173 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n48ck" podStartSLOduration=2.571361171 podStartE2EDuration="6.08715388s" podCreationTimestamp="2026-02-27 01:12:21 +0000 UTC" firstStartedPulling="2026-02-27 01:12:22.987317977 +0000 UTC m=+455.924879305" lastFinishedPulling="2026-02-27 01:12:26.503110716 +0000 UTC m=+459.440672014" observedRunningTime="2026-02-27 01:12:27.083031071 +0000 UTC m=+460.020592369" watchObservedRunningTime="2026-02-27 01:12:27.08715388 +0000 UTC m=+460.024715168" Feb 27 01:12:28 crc kubenswrapper[4771]: I0227 01:12:28.953063 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:12:28 crc kubenswrapper[4771]: I0227 01:12:28.953459 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:12:29 crc kubenswrapper[4771]: I0227 01:12:29.860507 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:29 crc kubenswrapper[4771]: I0227 01:12:29.860574 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:29 crc kubenswrapper[4771]: I0227 01:12:29.916950 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:30 crc kubenswrapper[4771]: I0227 01:12:30.069595 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:30 crc kubenswrapper[4771]: I0227 01:12:30.069668 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:30 crc kubenswrapper[4771]: I0227 01:12:30.139687 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-sl98j" Feb 27 01:12:31 crc kubenswrapper[4771]: I0227 01:12:31.127872 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nm89l" podUID="eb1594d2-dbd5-4e37-8d97-dac2a6357808" containerName="registry-server" probeResult="failure" output=< Feb 27 01:12:31 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 27 01:12:31 crc kubenswrapper[4771]: > Feb 27 01:12:32 crc kubenswrapper[4771]: I0227 01:12:32.239153 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:32 crc kubenswrapper[4771]: I0227 01:12:32.239566 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:32 crc kubenswrapper[4771]: I0227 01:12:32.287598 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:32 crc kubenswrapper[4771]: I0227 01:12:32.437933 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:32 crc kubenswrapper[4771]: I0227 01:12:32.438190 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:32 crc kubenswrapper[4771]: I0227 01:12:32.480149 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:33 crc kubenswrapper[4771]: I0227 01:12:33.134373 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6b7jf" Feb 27 01:12:33 crc kubenswrapper[4771]: I0227 01:12:33.142221 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n48ck" Feb 27 01:12:40 crc kubenswrapper[4771]: I0227 01:12:40.135022 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:40 crc kubenswrapper[4771]: I0227 01:12:40.192021 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nm89l" Feb 27 01:12:58 crc kubenswrapper[4771]: I0227 01:12:58.953671 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:12:58 crc kubenswrapper[4771]: I0227 01:12:58.954335 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:13:28 crc kubenswrapper[4771]: I0227 01:13:28.953452 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:13:28 crc kubenswrapper[4771]: I0227 01:13:28.954221 4771 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:13:28 crc kubenswrapper[4771]: I0227 01:13:28.954286 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:13:28 crc kubenswrapper[4771]: I0227 01:13:28.955358 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3edbc767662cebd8ad4cf0660d8b2225989bc9c500a2684a30fb57d6c7bf5f5f"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:13:28 crc kubenswrapper[4771]: I0227 01:13:28.955523 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://3edbc767662cebd8ad4cf0660d8b2225989bc9c500a2684a30fb57d6c7bf5f5f" gracePeriod=600 Feb 27 01:13:29 crc kubenswrapper[4771]: I0227 01:13:29.434513 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="3edbc767662cebd8ad4cf0660d8b2225989bc9c500a2684a30fb57d6c7bf5f5f" exitCode=0 Feb 27 01:13:29 crc kubenswrapper[4771]: I0227 01:13:29.434599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"3edbc767662cebd8ad4cf0660d8b2225989bc9c500a2684a30fb57d6c7bf5f5f"} Feb 27 01:13:29 crc kubenswrapper[4771]: I0227 01:13:29.434940 4771 scope.go:117] "RemoveContainer" containerID="593dd128afebc1cdf5cb1ec06b79af514cb5823cc663e626ae58c5b59daba867" Feb 27 01:13:30 crc kubenswrapper[4771]: I0227 01:13:30.445953 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"dc866424c36588a9cdf7ab45975036bf986f480af4ea79144c7263e416051408"} Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.133993 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535914-cm6cs"] Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.136129 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-cm6cs" Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.137993 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.137992 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.140332 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.141056 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-cm6cs"] Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.212745 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72p9w\" (UniqueName: \"kubernetes.io/projected/dc341037-7ad9-499e-b1cb-e3523551dcf5-kube-api-access-72p9w\") pod \"auto-csr-approver-29535914-cm6cs\" (UID: \"dc341037-7ad9-499e-b1cb-e3523551dcf5\") " pod="openshift-infra/auto-csr-approver-29535914-cm6cs" Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.313747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72p9w\" (UniqueName: \"kubernetes.io/projected/dc341037-7ad9-499e-b1cb-e3523551dcf5-kube-api-access-72p9w\") pod \"auto-csr-approver-29535914-cm6cs\" (UID: \"dc341037-7ad9-499e-b1cb-e3523551dcf5\") " pod="openshift-infra/auto-csr-approver-29535914-cm6cs" Feb 27 01:14:00 crc kubenswrapper[4771]: I0227 01:14:00.331284 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72p9w\" (UniqueName: \"kubernetes.io/projected/dc341037-7ad9-499e-b1cb-e3523551dcf5-kube-api-access-72p9w\") pod \"auto-csr-approver-29535914-cm6cs\" (UID: \"dc341037-7ad9-499e-b1cb-e3523551dcf5\") " pod="openshift-infra/auto-csr-approver-29535914-cm6cs" Feb 27 01:14:01 crc kubenswrapper[4771]: I0227 01:14:01.190189 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-cm6cs" Feb 27 01:14:01 crc kubenswrapper[4771]: I0227 01:14:01.391763 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-cm6cs"] Feb 27 01:14:01 crc kubenswrapper[4771]: I0227 01:14:01.411228 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:14:01 crc kubenswrapper[4771]: I0227 01:14:01.636068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-cm6cs" event={"ID":"dc341037-7ad9-499e-b1cb-e3523551dcf5","Type":"ContainerStarted","Data":"ceaf001dc3d8a715d28f286bfc8380d67107beb8fa88acc021a10ff9f722d875"} Feb 27 01:14:03 crc kubenswrapper[4771]: E0227 01:14:03.093129 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc341037_7ad9_499e_b1cb_e3523551dcf5.slice/crio-6008c2469e289a3ca37c87e8197b0bb98a04f606d12591197345ccfdb0bb85f8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc341037_7ad9_499e_b1cb_e3523551dcf5.slice/crio-conmon-6008c2469e289a3ca37c87e8197b0bb98a04f606d12591197345ccfdb0bb85f8.scope\": RecentStats: unable to find data in memory cache]" Feb 27 01:14:03 crc kubenswrapper[4771]: I0227 01:14:03.649522 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc341037-7ad9-499e-b1cb-e3523551dcf5" containerID="6008c2469e289a3ca37c87e8197b0bb98a04f606d12591197345ccfdb0bb85f8" exitCode=0 Feb 27 01:14:03 crc kubenswrapper[4771]: I0227 01:14:03.649658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-cm6cs" event={"ID":"dc341037-7ad9-499e-b1cb-e3523551dcf5","Type":"ContainerDied","Data":"6008c2469e289a3ca37c87e8197b0bb98a04f606d12591197345ccfdb0bb85f8"} Feb 27 01:14:04 crc kubenswrapper[4771]: I0227 01:14:04.912329 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-cm6cs" Feb 27 01:14:05 crc kubenswrapper[4771]: I0227 01:14:05.072875 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72p9w\" (UniqueName: \"kubernetes.io/projected/dc341037-7ad9-499e-b1cb-e3523551dcf5-kube-api-access-72p9w\") pod \"dc341037-7ad9-499e-b1cb-e3523551dcf5\" (UID: \"dc341037-7ad9-499e-b1cb-e3523551dcf5\") " Feb 27 01:14:05 crc kubenswrapper[4771]: I0227 01:14:05.079747 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc341037-7ad9-499e-b1cb-e3523551dcf5-kube-api-access-72p9w" (OuterVolumeSpecName: "kube-api-access-72p9w") pod "dc341037-7ad9-499e-b1cb-e3523551dcf5" (UID: "dc341037-7ad9-499e-b1cb-e3523551dcf5"). InnerVolumeSpecName "kube-api-access-72p9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:14:05 crc kubenswrapper[4771]: I0227 01:14:05.174480 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72p9w\" (UniqueName: \"kubernetes.io/projected/dc341037-7ad9-499e-b1cb-e3523551dcf5-kube-api-access-72p9w\") on node \"crc\" DevicePath \"\"" Feb 27 01:14:05 crc kubenswrapper[4771]: I0227 01:14:05.668618 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535914-cm6cs" event={"ID":"dc341037-7ad9-499e-b1cb-e3523551dcf5","Type":"ContainerDied","Data":"ceaf001dc3d8a715d28f286bfc8380d67107beb8fa88acc021a10ff9f722d875"} Feb 27 01:14:05 crc kubenswrapper[4771]: I0227 01:14:05.668675 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceaf001dc3d8a715d28f286bfc8380d67107beb8fa88acc021a10ff9f722d875" Feb 27 01:14:05 crc kubenswrapper[4771]: I0227 01:14:05.668742 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535914-cm6cs" Feb 27 01:14:05 crc kubenswrapper[4771]: I0227 01:14:05.988321 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-hhvn5"] Feb 27 01:14:06 crc kubenswrapper[4771]: I0227 01:14:06.008971 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535908-hhvn5"] Feb 27 01:14:07 crc kubenswrapper[4771]: I0227 01:14:07.780870 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d5634e-ce3f-40a5-b85d-64f8c4708c59" path="/var/lib/kubelet/pods/e0d5634e-ce3f-40a5-b85d-64f8c4708c59/volumes" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.149035 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv"] Feb 27 01:15:00 crc kubenswrapper[4771]: E0227 01:15:00.150043 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc341037-7ad9-499e-b1cb-e3523551dcf5" containerName="oc" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.150064 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc341037-7ad9-499e-b1cb-e3523551dcf5" containerName="oc" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.150240 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc341037-7ad9-499e-b1cb-e3523551dcf5" containerName="oc" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.150892 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.153328 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.153612 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.163475 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv"] Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.177642 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e692db9a-5217-48e9-a817-4ba90c53dc40-secret-volume\") pod \"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.177726 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e692db9a-5217-48e9-a817-4ba90c53dc40-config-volume\") pod \"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.177872 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62kwp\" (UniqueName: \"kubernetes.io/projected/e692db9a-5217-48e9-a817-4ba90c53dc40-kube-api-access-62kwp\") pod \"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.279658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e692db9a-5217-48e9-a817-4ba90c53dc40-secret-volume\") pod \"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.279713 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e692db9a-5217-48e9-a817-4ba90c53dc40-config-volume\") pod \"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.279772 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62kwp\" (UniqueName: \"kubernetes.io/projected/e692db9a-5217-48e9-a817-4ba90c53dc40-kube-api-access-62kwp\") pod \"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.280926 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e692db9a-5217-48e9-a817-4ba90c53dc40-config-volume\") pod 
\"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.288603 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e692db9a-5217-48e9-a817-4ba90c53dc40-secret-volume\") pod \"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.300240 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62kwp\" (UniqueName: \"kubernetes.io/projected/e692db9a-5217-48e9-a817-4ba90c53dc40-kube-api-access-62kwp\") pod \"collect-profiles-29535915-54bjv\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.479372 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:00 crc kubenswrapper[4771]: I0227 01:15:00.919253 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv"] Feb 27 01:15:01 crc kubenswrapper[4771]: I0227 01:15:01.072501 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" event={"ID":"e692db9a-5217-48e9-a817-4ba90c53dc40","Type":"ContainerStarted","Data":"a31da61834cbb0c225966d7a1426d852c5a8b10132a25b8041a82ce81a1ab856"} Feb 27 01:15:02 crc kubenswrapper[4771]: I0227 01:15:02.083181 4771 generic.go:334] "Generic (PLEG): container finished" podID="e692db9a-5217-48e9-a817-4ba90c53dc40" containerID="7c4907178ca1611c934099998063d527b93238a7ffb83d3e9030d58b6ba31ad3" exitCode=0 Feb 27 01:15:02 crc kubenswrapper[4771]: I0227 01:15:02.083426 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" event={"ID":"e692db9a-5217-48e9-a817-4ba90c53dc40","Type":"ContainerDied","Data":"7c4907178ca1611c934099998063d527b93238a7ffb83d3e9030d58b6ba31ad3"} Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.342963 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.419673 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e692db9a-5217-48e9-a817-4ba90c53dc40-config-volume\") pod \"e692db9a-5217-48e9-a817-4ba90c53dc40\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.419738 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62kwp\" (UniqueName: \"kubernetes.io/projected/e692db9a-5217-48e9-a817-4ba90c53dc40-kube-api-access-62kwp\") pod \"e692db9a-5217-48e9-a817-4ba90c53dc40\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.419799 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e692db9a-5217-48e9-a817-4ba90c53dc40-secret-volume\") pod \"e692db9a-5217-48e9-a817-4ba90c53dc40\" (UID: \"e692db9a-5217-48e9-a817-4ba90c53dc40\") " Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.420820 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e692db9a-5217-48e9-a817-4ba90c53dc40-config-volume" (OuterVolumeSpecName: "config-volume") pod "e692db9a-5217-48e9-a817-4ba90c53dc40" (UID: "e692db9a-5217-48e9-a817-4ba90c53dc40"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.426229 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e692db9a-5217-48e9-a817-4ba90c53dc40-kube-api-access-62kwp" (OuterVolumeSpecName: "kube-api-access-62kwp") pod "e692db9a-5217-48e9-a817-4ba90c53dc40" (UID: "e692db9a-5217-48e9-a817-4ba90c53dc40"). InnerVolumeSpecName "kube-api-access-62kwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.428032 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e692db9a-5217-48e9-a817-4ba90c53dc40-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e692db9a-5217-48e9-a817-4ba90c53dc40" (UID: "e692db9a-5217-48e9-a817-4ba90c53dc40"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.521241 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62kwp\" (UniqueName: \"kubernetes.io/projected/e692db9a-5217-48e9-a817-4ba90c53dc40-kube-api-access-62kwp\") on node \"crc\" DevicePath \"\"" Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.521292 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e692db9a-5217-48e9-a817-4ba90c53dc40-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:15:03 crc kubenswrapper[4771]: I0227 01:15:03.521310 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e692db9a-5217-48e9-a817-4ba90c53dc40-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:15:04 crc kubenswrapper[4771]: I0227 01:15:04.101150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" event={"ID":"e692db9a-5217-48e9-a817-4ba90c53dc40","Type":"ContainerDied","Data":"a31da61834cbb0c225966d7a1426d852c5a8b10132a25b8041a82ce81a1ab856"} Feb 27 01:15:04 crc kubenswrapper[4771]: I0227 01:15:04.101207 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a31da61834cbb0c225966d7a1426d852c5a8b10132a25b8041a82ce81a1ab856" Feb 27 01:15:04 crc kubenswrapper[4771]: I0227 01:15:04.101219 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv" Feb 27 01:15:07 crc kubenswrapper[4771]: I0227 01:15:07.088666 4771 scope.go:117] "RemoveContainer" containerID="7e3ed9db1076d237699426a40479cb3bc8a486dad98925003573a0e0fddf9312" Feb 27 01:15:07 crc kubenswrapper[4771]: I0227 01:15:07.123260 4771 scope.go:117] "RemoveContainer" containerID="e303e79e8da8b68b1195811949b089c0de2403ad7876b01aa15b28f29a99be5f" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.019269 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bbw6d"] Feb 27 01:15:23 crc kubenswrapper[4771]: E0227 01:15:23.020118 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e692db9a-5217-48e9-a817-4ba90c53dc40" containerName="collect-profiles" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.020132 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e692db9a-5217-48e9-a817-4ba90c53dc40" containerName="collect-profiles" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.020261 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e692db9a-5217-48e9-a817-4ba90c53dc40" containerName="collect-profiles" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.020800 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.030355 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bbw6d"] Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.111403 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-registry-tls\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.111442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90d86167-d6a1-4663-b6e2-b641f601102e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.111483 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-bound-sa-token\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.111518 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90d86167-d6a1-4663-b6e2-b641f601102e-registry-certificates\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.111597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.111637 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90d86167-d6a1-4663-b6e2-b641f601102e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.111663 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d86167-d6a1-4663-b6e2-b641f601102e-trusted-ca\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.111690 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz5m9\" (UniqueName: 
\"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-kube-api-access-sz5m9\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.134854 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.213522 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90d86167-d6a1-4663-b6e2-b641f601102e-registry-certificates\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.213690 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90d86167-d6a1-4663-b6e2-b641f601102e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.213762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d86167-d6a1-4663-b6e2-b641f601102e-trusted-ca\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.213824 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz5m9\" (UniqueName: \"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-kube-api-access-sz5m9\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.214155 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-registry-tls\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.214227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90d86167-d6a1-4663-b6e2-b641f601102e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.214395 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-bound-sa-token\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.214868 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90d86167-d6a1-4663-b6e2-b641f601102e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.215486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d86167-d6a1-4663-b6e2-b641f601102e-trusted-ca\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.216715 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90d86167-d6a1-4663-b6e2-b641f601102e-registry-certificates\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.221002 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90d86167-d6a1-4663-b6e2-b641f601102e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.221083 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-registry-tls\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.237250 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5m9\" (UniqueName: \"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-kube-api-access-sz5m9\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.238598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90d86167-d6a1-4663-b6e2-b641f601102e-bound-sa-token\") pod \"image-registry-66df7c8f76-bbw6d\" (UID: \"90d86167-d6a1-4663-b6e2-b641f601102e\") " pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.335863 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:23 crc kubenswrapper[4771]: I0227 01:15:23.608399 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bbw6d"] Feb 27 01:15:24 crc kubenswrapper[4771]: I0227 01:15:24.231214 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" event={"ID":"90d86167-d6a1-4663-b6e2-b641f601102e","Type":"ContainerStarted","Data":"f3258ada7273397712e9ea7dccbecd595c27e4c956c9bfc7d4f80b4c4bc4967e"} Feb 27 01:15:24 crc kubenswrapper[4771]: I0227 01:15:24.231303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" event={"ID":"90d86167-d6a1-4663-b6e2-b641f601102e","Type":"ContainerStarted","Data":"b3bec34079ed10027f8380e192bc191206aa2e40048e9cd6c5f4265babb9f2c7"} Feb 27 01:15:24 crc kubenswrapper[4771]: I0227 01:15:24.231444 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:24 crc kubenswrapper[4771]: I0227 01:15:24.260423 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" podStartSLOduration=2.260405102 podStartE2EDuration="2.260405102s" podCreationTimestamp="2026-02-27 01:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:15:24.255866997 +0000 UTC m=+637.193428295" watchObservedRunningTime="2026-02-27 01:15:24.260405102 +0000 UTC m=+637.197966390" Feb 27 01:15:43 crc kubenswrapper[4771]: I0227 01:15:43.341968 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bbw6d" Feb 27 01:15:43 crc kubenswrapper[4771]: I0227 01:15:43.409802 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmxc8"] Feb 27 01:15:58 crc kubenswrapper[4771]: I0227 01:15:58.953048 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:15:58 crc kubenswrapper[4771]: I0227 01:15:58.953704 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.139122 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535916-v5rj9"] Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.140322 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-v5rj9" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.142927 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.143191 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.143292 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.148679 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-v5rj9"] Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.232989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qlt\" (UniqueName: \"kubernetes.io/projected/921de8cb-d569-47a5-97c8-ad7f94db475e-kube-api-access-46qlt\") pod \"auto-csr-approver-29535916-v5rj9\" (UID: \"921de8cb-d569-47a5-97c8-ad7f94db475e\") " pod="openshift-infra/auto-csr-approver-29535916-v5rj9" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.334070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qlt\" (UniqueName: \"kubernetes.io/projected/921de8cb-d569-47a5-97c8-ad7f94db475e-kube-api-access-46qlt\") pod \"auto-csr-approver-29535916-v5rj9\" (UID: \"921de8cb-d569-47a5-97c8-ad7f94db475e\") " pod="openshift-infra/auto-csr-approver-29535916-v5rj9" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.352688 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qlt\" (UniqueName: \"kubernetes.io/projected/921de8cb-d569-47a5-97c8-ad7f94db475e-kube-api-access-46qlt\") pod \"auto-csr-approver-29535916-v5rj9\" (UID: \"921de8cb-d569-47a5-97c8-ad7f94db475e\") " pod="openshift-infra/auto-csr-approver-29535916-v5rj9" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.515680 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-v5rj9" Feb 27 01:16:00 crc kubenswrapper[4771]: I0227 01:16:00.710126 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-v5rj9"] Feb 27 01:16:01 crc kubenswrapper[4771]: I0227 01:16:01.483229 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-v5rj9" event={"ID":"921de8cb-d569-47a5-97c8-ad7f94db475e","Type":"ContainerStarted","Data":"4e2182c03373b3f7aaa64e43dab8cfbf96295f83424fb064e9d0bbd504c424b9"} Feb 27 01:16:02 crc kubenswrapper[4771]: I0227 01:16:02.491774 4771 generic.go:334] "Generic (PLEG): container finished" podID="921de8cb-d569-47a5-97c8-ad7f94db475e" containerID="f8477f3875ca09bdc0753dab90f4d9838358f9c121298b5d73e1a1a66cf13a2d" exitCode=0 Feb 27 01:16:02 crc kubenswrapper[4771]: I0227 01:16:02.491875 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-v5rj9" event={"ID":"921de8cb-d569-47a5-97c8-ad7f94db475e","Type":"ContainerDied","Data":"f8477f3875ca09bdc0753dab90f4d9838358f9c121298b5d73e1a1a66cf13a2d"} Feb 27 01:16:03 crc kubenswrapper[4771]: I0227 01:16:03.773884 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-v5rj9" Feb 27 01:16:03 crc kubenswrapper[4771]: I0227 01:16:03.897443 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46qlt\" (UniqueName: \"kubernetes.io/projected/921de8cb-d569-47a5-97c8-ad7f94db475e-kube-api-access-46qlt\") pod \"921de8cb-d569-47a5-97c8-ad7f94db475e\" (UID: \"921de8cb-d569-47a5-97c8-ad7f94db475e\") " Feb 27 01:16:03 crc kubenswrapper[4771]: I0227 01:16:03.904727 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921de8cb-d569-47a5-97c8-ad7f94db475e-kube-api-access-46qlt" (OuterVolumeSpecName: "kube-api-access-46qlt") pod "921de8cb-d569-47a5-97c8-ad7f94db475e" (UID: "921de8cb-d569-47a5-97c8-ad7f94db475e"). InnerVolumeSpecName "kube-api-access-46qlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:16:04 crc kubenswrapper[4771]: I0227 01:16:03.999915 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46qlt\" (UniqueName: \"kubernetes.io/projected/921de8cb-d569-47a5-97c8-ad7f94db475e-kube-api-access-46qlt\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:04 crc kubenswrapper[4771]: I0227 01:16:04.521110 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535916-v5rj9" event={"ID":"921de8cb-d569-47a5-97c8-ad7f94db475e","Type":"ContainerDied","Data":"4e2182c03373b3f7aaa64e43dab8cfbf96295f83424fb064e9d0bbd504c424b9"} Feb 27 01:16:04 crc kubenswrapper[4771]: I0227 01:16:04.521163 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e2182c03373b3f7aaa64e43dab8cfbf96295f83424fb064e9d0bbd504c424b9" Feb 27 01:16:04 crc kubenswrapper[4771]: I0227 01:16:04.521193 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535916-v5rj9" Feb 27 01:16:04 crc kubenswrapper[4771]: I0227 01:16:04.829174 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-59vm6"] Feb 27 01:16:04 crc kubenswrapper[4771]: I0227 01:16:04.835123 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535910-59vm6"] Feb 27 01:16:05 crc kubenswrapper[4771]: I0227 01:16:05.784148 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c83ff4-13dd-4091-a05b-1f9b624fa886" path="/var/lib/kubelet/pods/13c83ff4-13dd-4091-a05b-1f9b624fa886/volumes" Feb 27 01:16:07 crc kubenswrapper[4771]: I0227 01:16:07.187733 4771 scope.go:117] "RemoveContainer" containerID="28ff4feea473c2d9c69ca8261a4e864aeb36499b9e1851ea7fdfadf1057cc13f" Feb 27 01:16:07 crc kubenswrapper[4771]: I0227 01:16:07.229713 4771 scope.go:117] "RemoveContainer" containerID="152d0ec9817218b57a398bab4a74271426223cf7219a89961aefc31ce90a8199" Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.459586 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" podUID="ebbe1c67-5385-4eda-af88-793c2c85e043" containerName="registry" containerID="cri-o://6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc" gracePeriod=30 Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.857990 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.972183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.972257 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebbe1c67-5385-4eda-af88-793c2c85e043-installation-pull-secrets\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.972313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-bound-sa-token\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.972382 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebbe1c67-5385-4eda-af88-793c2c85e043-ca-trust-extracted\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.972428 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkmrm\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-kube-api-access-hkmrm\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.972463 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-certificates\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.972499 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-trusted-ca\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.972590 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-tls\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.974249 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.975277 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.979610 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-kube-api-access-hkmrm" (OuterVolumeSpecName: "kube-api-access-hkmrm") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043"). InnerVolumeSpecName "kube-api-access-hkmrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.982692 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.986145 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbe1c67-5385-4eda-af88-793c2c85e043-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:16:08 crc kubenswrapper[4771]: I0227 01:16:08.987250 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:16:08 crc kubenswrapper[4771]: E0227 01:16:08.988666 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:ebbe1c67-5385-4eda-af88-793c2c85e043 nodeName:}" failed. No retries permitted until 2026-02-27 01:16:09.488630562 +0000 UTC m=+682.426191910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "registry-storage" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.005250 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebbe1c67-5385-4eda-af88-793c2c85e043-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043"). 
InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.074520 4771 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.074578 4771 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ebbe1c67-5385-4eda-af88-793c2c85e043-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.074593 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.074607 4771 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ebbe1c67-5385-4eda-af88-793c2c85e043-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.074620 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkmrm\" (UniqueName: \"kubernetes.io/projected/ebbe1c67-5385-4eda-af88-793c2c85e043-kube-api-access-hkmrm\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.074633 4771 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.074647 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebbe1c67-5385-4eda-af88-793c2c85e043-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.557579 4771 generic.go:334] "Generic (PLEG): container finished" podID="ebbe1c67-5385-4eda-af88-793c2c85e043" containerID="6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc" exitCode=0 Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.557647 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" event={"ID":"ebbe1c67-5385-4eda-af88-793c2c85e043","Type":"ContainerDied","Data":"6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc"} Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.557728 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" event={"ID":"ebbe1c67-5385-4eda-af88-793c2c85e043","Type":"ContainerDied","Data":"9fbbf4ad51c25094c0cfedf118c56dea5edbfae24cdc2aa14d2c61ab253f3321"} Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.557758 4771 scope.go:117] "RemoveContainer" containerID="6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.557691 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zmxc8" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.576204 4771 scope.go:117] "RemoveContainer" containerID="6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc" Feb 27 01:16:09 crc kubenswrapper[4771]: E0227 01:16:09.576704 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc\": container with ID starting with 6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc not found: ID does not exist" containerID="6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.576743 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc"} err="failed to get container status \"6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc\": rpc error: code = NotFound desc = could not find container \"6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc\": container with ID starting with 6f8cfd30afc043134f09522b81b040d1bb1294bced13422d93e2097bbfe219cc not found: ID does not exist" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.581651 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ebbe1c67-5385-4eda-af88-793c2c85e043\" (UID: \"ebbe1c67-5385-4eda-af88-793c2c85e043\") " Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.595184 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "ebbe1c67-5385-4eda-af88-793c2c85e043" (UID: "ebbe1c67-5385-4eda-af88-793c2c85e043"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.691061 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmxc8"] Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.695056 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zmxc8"] Feb 27 01:16:09 crc kubenswrapper[4771]: I0227 01:16:09.780379 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbe1c67-5385-4eda-af88-793c2c85e043" path="/var/lib/kubelet/pods/ebbe1c67-5385-4eda-af88-793c2c85e043/volumes" Feb 27 01:16:28 crc kubenswrapper[4771]: I0227 01:16:28.953350 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:16:28 crc kubenswrapper[4771]: I0227 01:16:28.954031 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:16:58 crc kubenswrapper[4771]: I0227 01:16:58.953197 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:16:58 crc kubenswrapper[4771]: I0227 01:16:58.953976 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:16:58 crc kubenswrapper[4771]: I0227 01:16:58.954053 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:16:58 crc kubenswrapper[4771]: I0227 01:16:58.955074 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dc866424c36588a9cdf7ab45975036bf986f480af4ea79144c7263e416051408"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:16:58 crc kubenswrapper[4771]: I0227 01:16:58.955183 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://dc866424c36588a9cdf7ab45975036bf986f480af4ea79144c7263e416051408" gracePeriod=600 Feb 27 01:16:59 crc kubenswrapper[4771]: I0227 01:16:59.938411 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="dc866424c36588a9cdf7ab45975036bf986f480af4ea79144c7263e416051408" exitCode=0 Feb 27 01:16:59 crc kubenswrapper[4771]: I0227 01:16:59.938514 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"dc866424c36588a9cdf7ab45975036bf986f480af4ea79144c7263e416051408"} Feb 27 01:16:59 crc kubenswrapper[4771]: I0227 01:16:59.939150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"c06019bd1d417bdca00ed2eff4e51501f46dbc51fa52f89a80770d81ea06c432"} Feb 27 01:16:59 crc kubenswrapper[4771]: I0227 01:16:59.939183 4771 scope.go:117] "RemoveContainer" containerID="3edbc767662cebd8ad4cf0660d8b2225989bc9c500a2684a30fb57d6c7bf5f5f" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.104121 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm"] Feb 27 01:17:32 crc kubenswrapper[4771]: E0227 01:17:32.105076 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921de8cb-d569-47a5-97c8-ad7f94db475e" containerName="oc" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.105101 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="921de8cb-d569-47a5-97c8-ad7f94db475e" containerName="oc" Feb 27 01:17:32 crc kubenswrapper[4771]: E0227 01:17:32.105123 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbe1c67-5385-4eda-af88-793c2c85e043" containerName="registry" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.105135 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbe1c67-5385-4eda-af88-793c2c85e043" containerName="registry" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.105312 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbe1c67-5385-4eda-af88-793c2c85e043" containerName="registry" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.105329 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="921de8cb-d569-47a5-97c8-ad7f94db475e" containerName="oc" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.105935 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.108118 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.109359 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qnhhr" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.110359 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.116688 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm"] Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.128413 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-6n589"] Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.129082 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-6n589" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.131062 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qv95z" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.153102 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gd2tq"] Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.153811 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.154897 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-6n589"] Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.155269 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-w54ch" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.158752 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gd2tq"] Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.225323 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7b4s\" (UniqueName: \"kubernetes.io/projected/dca42308-0eb3-4c5b-a620-cbbb29c3c88f-kube-api-access-j7b4s\") pod \"cert-manager-cainjector-cf98fcc89-8pmgm\" (UID: \"dca42308-0eb3-4c5b-a620-cbbb29c3c88f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.225443 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g87lj\" (UniqueName: \"kubernetes.io/projected/d066e334-9b58-464d-80d8-899a6390d5c5-kube-api-access-g87lj\") pod \"cert-manager-webhook-687f57d79b-gd2tq\" (UID: \"d066e334-9b58-464d-80d8-899a6390d5c5\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.225516 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2h5\" (UniqueName: \"kubernetes.io/projected/6a0dd098-846f-4aab-b87f-4d06728195c5-kube-api-access-jw2h5\") pod \"cert-manager-858654f9db-6n589\" (UID: \"6a0dd098-846f-4aab-b87f-4d06728195c5\") " pod="cert-manager/cert-manager-858654f9db-6n589" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.326181 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2h5\" (UniqueName: \"kubernetes.io/projected/6a0dd098-846f-4aab-b87f-4d06728195c5-kube-api-access-jw2h5\") pod \"cert-manager-858654f9db-6n589\" (UID: \"6a0dd098-846f-4aab-b87f-4d06728195c5\") " pod="cert-manager/cert-manager-858654f9db-6n589" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.326227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7b4s\" (UniqueName: \"kubernetes.io/projected/dca42308-0eb3-4c5b-a620-cbbb29c3c88f-kube-api-access-j7b4s\") pod \"cert-manager-cainjector-cf98fcc89-8pmgm\" (UID: \"dca42308-0eb3-4c5b-a620-cbbb29c3c88f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.326283 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g87lj\" (UniqueName: 
\"kubernetes.io/projected/d066e334-9b58-464d-80d8-899a6390d5c5-kube-api-access-g87lj\") pod \"cert-manager-webhook-687f57d79b-gd2tq\" (UID: \"d066e334-9b58-464d-80d8-899a6390d5c5\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.346097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7b4s\" (UniqueName: \"kubernetes.io/projected/dca42308-0eb3-4c5b-a620-cbbb29c3c88f-kube-api-access-j7b4s\") pod \"cert-manager-cainjector-cf98fcc89-8pmgm\" (UID: \"dca42308-0eb3-4c5b-a620-cbbb29c3c88f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.346521 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g87lj\" (UniqueName: \"kubernetes.io/projected/d066e334-9b58-464d-80d8-899a6390d5c5-kube-api-access-g87lj\") pod \"cert-manager-webhook-687f57d79b-gd2tq\" (UID: \"d066e334-9b58-464d-80d8-899a6390d5c5\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.347726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2h5\" (UniqueName: \"kubernetes.io/projected/6a0dd098-846f-4aab-b87f-4d06728195c5-kube-api-access-jw2h5\") pod \"cert-manager-858654f9db-6n589\" (UID: \"6a0dd098-846f-4aab-b87f-4d06728195c5\") " pod="cert-manager/cert-manager-858654f9db-6n589" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.425766 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.442303 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-6n589" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.468482 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.890360 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm"] Feb 27 01:17:32 crc kubenswrapper[4771]: W0227 01:17:32.898049 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddca42308_0eb3_4c5b_a620_cbbb29c3c88f.slice/crio-c07bfa52116205e3f231e1fdf1b8f2cd0a53756a12362511139d695a3c5d47ee WatchSource:0}: Error finding container c07bfa52116205e3f231e1fdf1b8f2cd0a53756a12362511139d695a3c5d47ee: Status 404 returned error can't find the container with id c07bfa52116205e3f231e1fdf1b8f2cd0a53756a12362511139d695a3c5d47ee Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.943064 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gd2tq"] Feb 27 01:17:32 crc kubenswrapper[4771]: I0227 01:17:32.966129 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-6n589"] Feb 27 01:17:33 crc kubenswrapper[4771]: I0227 01:17:33.167953 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm" event={"ID":"dca42308-0eb3-4c5b-a620-cbbb29c3c88f","Type":"ContainerStarted","Data":"c07bfa52116205e3f231e1fdf1b8f2cd0a53756a12362511139d695a3c5d47ee"} Feb 27 01:17:33 crc kubenswrapper[4771]: I0227 01:17:33.169328 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-6n589" event={"ID":"6a0dd098-846f-4aab-b87f-4d06728195c5","Type":"ContainerStarted","Data":"df33ef64abf35a30947ebc73a6b5251b951c3cd50d6ba19b73e1fd6bcf518312"} Feb 27 01:17:33 crc kubenswrapper[4771]: I0227 01:17:33.170535 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" event={"ID":"d066e334-9b58-464d-80d8-899a6390d5c5","Type":"ContainerStarted","Data":"f02f3ed1bb45faa89d520e4ab8fca9efb7d895039a521ca37191e4433a6ce0dc"} Feb 27 01:17:37 crc kubenswrapper[4771]: I0227 01:17:37.202409 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-6n589" event={"ID":"6a0dd098-846f-4aab-b87f-4d06728195c5","Type":"ContainerStarted","Data":"4c987930888910fd3a1d7def7d912c250584f5dcb2e2d8a37250a988e3c87f95"} Feb 27 01:17:37 crc kubenswrapper[4771]: I0227 01:17:37.204911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" event={"ID":"d066e334-9b58-464d-80d8-899a6390d5c5","Type":"ContainerStarted","Data":"b18f1b970877a0fac5e075b06226a1b9d8057ec2ee66e7d6b321b51dbc5cf900"} Feb 27 01:17:37 crc kubenswrapper[4771]: I0227 01:17:37.205006 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" Feb 27 01:17:37 crc kubenswrapper[4771]: I0227 01:17:37.208997 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm" event={"ID":"dca42308-0eb3-4c5b-a620-cbbb29c3c88f","Type":"ContainerStarted","Data":"d7f98aa100877b0c9586645bfd60f8732e4c5140b05a2309b539c4f696c092a6"} Feb 27 01:17:37 crc kubenswrapper[4771]: I0227 01:17:37.222453 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-6n589" podStartSLOduration=1.750673079 podStartE2EDuration="5.222437762s" 
podCreationTimestamp="2026-02-27 01:17:32 +0000 UTC" firstStartedPulling="2026-02-27 01:17:32.972815613 +0000 UTC m=+765.910376931" lastFinishedPulling="2026-02-27 01:17:36.444580296 +0000 UTC m=+769.382141614" observedRunningTime="2026-02-27 01:17:37.222191584 +0000 UTC m=+770.159752882" watchObservedRunningTime="2026-02-27 01:17:37.222437762 +0000 UTC m=+770.159999070" Feb 27 01:17:37 crc kubenswrapper[4771]: I0227 01:17:37.260937 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" podStartSLOduration=1.704018859 podStartE2EDuration="5.26091257s" podCreationTimestamp="2026-02-27 01:17:32 +0000 UTC" firstStartedPulling="2026-02-27 01:17:32.970005668 +0000 UTC m=+765.907566976" lastFinishedPulling="2026-02-27 01:17:36.526899379 +0000 UTC m=+769.464460687" observedRunningTime="2026-02-27 01:17:37.255462713 +0000 UTC m=+770.193024031" watchObservedRunningTime="2026-02-27 01:17:37.26091257 +0000 UTC m=+770.198473898" Feb 27 01:17:37 crc kubenswrapper[4771]: I0227 01:17:37.288458 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pmgm" podStartSLOduration=1.749584101 podStartE2EDuration="5.288428124s" podCreationTimestamp="2026-02-27 01:17:32 +0000 UTC" firstStartedPulling="2026-02-27 01:17:32.905684282 +0000 UTC m=+765.843245580" lastFinishedPulling="2026-02-27 01:17:36.444528285 +0000 UTC m=+769.382089603" observedRunningTime="2026-02-27 01:17:37.280242082 +0000 UTC m=+770.217803410" watchObservedRunningTime="2026-02-27 01:17:37.288428124 +0000 UTC m=+770.225989452" Feb 27 01:17:42 crc kubenswrapper[4771]: I0227 01:17:42.471913 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-gd2tq" Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.602023 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-h5vs8"] Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.602353 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovn-controller" containerID="cri-o://a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62" gracePeriod=30 Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.602793 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="sbdb" containerID="cri-o://b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411" gracePeriod=30 Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.602844 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="nbdb" containerID="cri-o://735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1" gracePeriod=30 Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.602887 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="northd" containerID="cri-o://2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90" gracePeriod=30 Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.602932 4771 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70" gracePeriod=30 Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.602976 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kube-rbac-proxy-node" containerID="cri-o://965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08" gracePeriod=30 Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.603012 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovn-acl-logging" containerID="cri-o://17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c" gracePeriod=30 Feb 27 01:17:44 crc kubenswrapper[4771]: I0227 01:17:44.689261 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" containerID="cri-o://81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" gracePeriod=30 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.011171 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/3.log" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.014844 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovn-acl-logging/0.log" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.015661 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovn-controller/0.log" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.016135 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.085770 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z2hps"] Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.085990 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="northd" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086004 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="northd" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086022 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kubecfg-setup" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086031 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kubecfg-setup" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086046 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086055 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086069 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovn-acl-logging" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086077 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovn-acl-logging" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086087 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086095 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086105 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086114 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086123 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086132 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086142 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="sbdb" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086150 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="sbdb" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086163 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="nbdb" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086170 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="nbdb" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086183 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kube-rbac-proxy-node" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086191 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kube-rbac-proxy-node" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086205 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086213 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086222 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovn-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086230 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovn-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086339 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="sbdb" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086352 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086365 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086377 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086389 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086400 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086411 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovn-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086419 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="nbdb" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086430 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovn-acl-logging" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086445 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="kube-rbac-proxy-node" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086456 
4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="northd" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.086590 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086600 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.086730 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerName="ovnkube-controller" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.088704 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155094 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-var-lib-openvswitch\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155184 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-openvswitch\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155222 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-ovn-kubernetes\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155260 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c6hm\" (UniqueName: \"kubernetes.io/projected/21f824c6-1bde-4e58-b4ef-72a56a140abb-kube-api-access-9c6hm\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155295 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-netns\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155345 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). 
InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155347 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovn-node-metrics-cert\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155421 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-etc-openvswitch\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155452 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-systemd\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-env-overrides\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155525 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-script-lib\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155619 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-bin\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155649 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-systemd-units\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155678 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-slash\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155706 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-kubelet\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155736 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-config\") pod 
\"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155770 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-ovn\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155803 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155836 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-netd\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155868 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-log-socket\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.155902 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-node-log\") pod \"21f824c6-1bde-4e58-b4ef-72a56a140abb\" (UID: \"21f824c6-1bde-4e58-b4ef-72a56a140abb\") " Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156042 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovnkube-script-lib\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156082 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-systemd-units\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156113 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-env-overrides\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156200 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-run-netns\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 
01:17:45.156230 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-node-log\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156263 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156294 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-log-socket\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156329 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-cni-bin\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156362 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-var-lib-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156414 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbv6\" (UniqueName: \"kubernetes.io/projected/2295d052-efbd-4d2f-b1ba-20fc81f6da86-kube-api-access-dkbv6\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156451 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovn-node-metrics-cert\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156484 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-slash\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-kubelet\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-systemd\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156602 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-etc-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156630 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovnkube-config\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156631 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156661 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-cni-netd\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156665 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156703 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156732 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-ovn\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156736 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156736 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156757 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-slash" (OuterVolumeSpecName: "host-slash") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156790 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156882 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156908 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156954 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157033 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157019 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-log-socket" (OuterVolumeSpecName: "log-socket") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157062 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-node-log" (OuterVolumeSpecName: "node-log") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.156772 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157153 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157384 4771 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157422 4771 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157569 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157615 4771 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157652 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157664 4771 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157731 4771 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157753 4771 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-slash\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.157771 4771 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.160839 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.161220 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f824c6-1bde-4e58-b4ef-72a56a140abb-kube-api-access-9c6hm" (OuterVolumeSpecName: "kube-api-access-9c6hm") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "kube-api-access-9c6hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.182400 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "21f824c6-1bde-4e58-b4ef-72a56a140abb" (UID: "21f824c6-1bde-4e58-b4ef-72a56a140abb"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.258891 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-slash\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259005 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-etc-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259090 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-kubelet\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259124 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-systemd\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovnkube-config\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-cni-netd\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259237 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: 
\"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259235 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-kubelet\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259284 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-ovn\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259307 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-cni-netd\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259312 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-systemd\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259234 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-etc-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-ovn\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259373 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-run-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259419 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259491 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovnkube-script-lib\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-systemd-units\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-env-overrides\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259605 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-systemd-units\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259637 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-slash\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259716 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-run-netns\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259751 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-node-log\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259789 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259799 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-run-netns\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259822 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-log-socket\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259861 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-cni-bin\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-var-lib-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259863 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-node-log\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259938 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbv6\" (UniqueName: \"kubernetes.io/projected/2295d052-efbd-4d2f-b1ba-20fc81f6da86-kube-api-access-dkbv6\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259902 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-run-ovn-kubernetes\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259973 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovn-node-metrics-cert\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259904 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-log-socket\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-var-lib-openvswitch\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.259904 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2295d052-efbd-4d2f-b1ba-20fc81f6da86-host-cni-bin\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260130 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260295 4771 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260332 4771 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260353 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-env-overrides\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260471 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260503 4771 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260530 4771 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260600 4771 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260629 4771 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-log-socket\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260654 4771 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-node-log\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260679 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c6hm\" (UniqueName: \"kubernetes.io/projected/21f824c6-1bde-4e58-b4ef-72a56a140abb-kube-api-access-9c6hm\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260705 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21f824c6-1bde-4e58-b4ef-72a56a140abb-ovn-node-metrics-cert\") on node \"crc\" 
DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260729 4771 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21f824c6-1bde-4e58-b4ef-72a56a140abb-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260752 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21f824c6-1bde-4e58-b4ef-72a56a140abb-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.260943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovnkube-config\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.261932 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovnkube-script-lib\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.264790 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2295d052-efbd-4d2f-b1ba-20fc81f6da86-ovn-node-metrics-cert\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.277326 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovnkube-controller/3.log" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.282212 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovn-acl-logging/0.log" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.283229 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h5vs8_21f824c6-1bde-4e58-b4ef-72a56a140abb/ovn-controller/0.log" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284093 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" exitCode=0 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284144 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411" exitCode=0 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284164 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1" exitCode=0 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284179 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90" exitCode=0 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284195 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70" exitCode=0 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284209 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284210 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08" exitCode=0 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284478 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c" exitCode=143 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284191 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284524 4771 generic.go:334] "Generic (PLEG): container finished" podID="21f824c6-1bde-4e58-b4ef-72a56a140abb" containerID="a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62" exitCode=143 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284603 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284641 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284664 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284698 4771 scope.go:117] "RemoveContainer" containerID="81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284704 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284924 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284943 4771 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284955 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284967 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284979 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.284990 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285002 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285013 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285024 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285046 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285066 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285079 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285089 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285099 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285110 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285120 4771 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285131 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285160 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285172 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285183 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.285216 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286394 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286428 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286500 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286513 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286523 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286534 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286545 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286582 4771 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286595 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286615 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h5vs8" event={"ID":"21f824c6-1bde-4e58-b4ef-72a56a140abb","Type":"ContainerDied","Data":"5457b050cedf3d58c016746fe9ae19016484dee6506d5903e4a66dadde91100c"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286777 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286801 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286812 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286823 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286834 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286844 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286855 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286865 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286887 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.286898 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.289321 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/2.log" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.290277 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/1.log" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.290351 4771 generic.go:334] "Generic (PLEG): container finished" podID="3c460c23-4b4a-458f-a52e-4208b9942829" containerID="1f5c442299aaf88392fdb9b66293dcde4d1eac2143b2828533d23ec4d8860a72" exitCode=2 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.290389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-srbwq" event={"ID":"3c460c23-4b4a-458f-a52e-4208b9942829","Type":"ContainerDied","Data":"1f5c442299aaf88392fdb9b66293dcde4d1eac2143b2828533d23ec4d8860a72"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.290419 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda"} Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.291298 4771 scope.go:117] "RemoveContainer" containerID="1f5c442299aaf88392fdb9b66293dcde4d1eac2143b2828533d23ec4d8860a72" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.291713 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-srbwq_openshift-multus(3c460c23-4b4a-458f-a52e-4208b9942829)\"" pod="openshift-multus/multus-srbwq" podUID="3c460c23-4b4a-458f-a52e-4208b9942829" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.294023 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbv6\" (UniqueName: \"kubernetes.io/projected/2295d052-efbd-4d2f-b1ba-20fc81f6da86-kube-api-access-dkbv6\") pod \"ovnkube-node-z2hps\" (UID: \"2295d052-efbd-4d2f-b1ba-20fc81f6da86\") " pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.311191 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.349016 4771 scope.go:117] "RemoveContainer" containerID="b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.351041 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-h5vs8"] Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.356482 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-h5vs8"] Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.370053 4771 scope.go:117] "RemoveContainer" containerID="735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.388772 4771 scope.go:117] "RemoveContainer" containerID="2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.407159 4771 scope.go:117] "RemoveContainer" containerID="ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.410874 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.434804 4771 scope.go:117] "RemoveContainer" containerID="965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08" Feb 27 01:17:45 crc kubenswrapper[4771]: W0227 01:17:45.443426 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2295d052_efbd_4d2f_b1ba_20fc81f6da86.slice/crio-efe3cf652d3b23171d0f115644c3d051ac6522644cc2483f8c9420f7606faeb2 WatchSource:0}: Error finding container efe3cf652d3b23171d0f115644c3d051ac6522644cc2483f8c9420f7606faeb2: Status 404 returned error can't find the container with id efe3cf652d3b23171d0f115644c3d051ac6522644cc2483f8c9420f7606faeb2 Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.452668 4771 scope.go:117] "RemoveContainer" containerID="17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.483200 4771 scope.go:117] "RemoveContainer" containerID="a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.507007 4771 scope.go:117] "RemoveContainer" containerID="ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.527064 4771 scope.go:117] "RemoveContainer" containerID="81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.527857 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": container with ID starting with 81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66 not found: ID does not exist" containerID="81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.527895 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} err="failed to get container status \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": rpc error: code = NotFound desc = could not find container \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": container with ID starting with 81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.527924 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.528422 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": container with ID starting with 143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b not found: ID does not exist" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.528488 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} err="failed to get container status \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": rpc 
error: code = NotFound desc = could not find container \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": container with ID starting with 143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.528581 4771 scope.go:117] "RemoveContainer" containerID="b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.529012 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": container with ID starting with b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411 not found: ID does not exist" containerID="b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.529042 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} err="failed to get container status \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": rpc error: code = NotFound desc = could not find container \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": container with ID starting with b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.529064 4771 scope.go:117] "RemoveContainer" containerID="735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.529658 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": container with ID starting with 735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1 not found: ID does not exist" containerID="735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.529695 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} err="failed to get container status \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": rpc error: code = NotFound desc = could not find container \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": container with ID starting with 735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.529717 4771 scope.go:117] "RemoveContainer" containerID="2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.530315 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": container with ID starting with 2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90 not found: ID does not exist" containerID="2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.530355 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} err="failed to get container status \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": rpc error: code = NotFound desc = could not find container \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": container with ID starting with 2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.530381 4771 scope.go:117] "RemoveContainer" containerID="ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.531150 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": container with ID starting with ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70 not found: ID does not exist" containerID="ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.531190 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} err="failed to get container status \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": rpc error: code = NotFound desc = could not find container \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": container with ID starting with ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.531208 4771 scope.go:117] "RemoveContainer" containerID="965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.531500 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": container with ID starting with 965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08 not found: ID does not exist" containerID="965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.531587 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} err="failed to get container status \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": rpc error: code = NotFound desc = could not find container \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": container with ID starting with 965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.531626 4771 scope.go:117] "RemoveContainer" containerID="17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.532062 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": container with ID starting with 17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c not found: ID does not exist" 
containerID="17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.532095 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} err="failed to get container status \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": rpc error: code = NotFound desc = could not find container \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": container with ID starting with 17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.532114 4771 scope.go:117] "RemoveContainer" containerID="a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.532813 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": container with ID starting with a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62 not found: ID does not exist" containerID="a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.532844 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} err="failed to get container status \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": rpc error: code = NotFound desc = could not find container \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": container with ID starting with a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.532861 4771 scope.go:117] "RemoveContainer" containerID="ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94" Feb 27 01:17:45 crc kubenswrapper[4771]: E0227 01:17:45.533120 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": container with ID starting with ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94 not found: ID does not exist" containerID="ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.533148 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} err="failed to get container status \"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": rpc error: code = NotFound desc = could not find container \"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": container with ID starting with ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.533167 4771 scope.go:117] "RemoveContainer" containerID="81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.533930 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} err="failed to get container status \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": rpc error: code = NotFound desc = could not find container \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": container with ID starting with 81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.533959 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.534408 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} err="failed to get container status \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": rpc error: code = NotFound desc = could not find container \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": container with ID starting with 143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.534494 4771 scope.go:117] "RemoveContainer" containerID="b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.535059 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} err="failed to get container status \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": rpc error: code = NotFound desc = could not find container \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": container with ID starting with b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.535134 4771 scope.go:117] "RemoveContainer" containerID="735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.535737 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} err="failed to get container status \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": rpc error: code = NotFound desc = could not find container \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": container with ID starting with 735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.535766 4771 scope.go:117] "RemoveContainer" containerID="2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.536154 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} err="failed to get container status \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": rpc error: code = NotFound desc = could not find container \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": container with ID starting with 2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90 not found: ID does not exist" Feb 
27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.536177 4771 scope.go:117] "RemoveContainer" containerID="ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.536732 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} err="failed to get container status \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": rpc error: code = NotFound desc = could not find container \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": container with ID starting with ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.536765 4771 scope.go:117] "RemoveContainer" containerID="965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.537104 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} err="failed to get container status \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": rpc error: code = NotFound desc = could not find container \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": container with ID starting with 965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.537130 4771 scope.go:117] "RemoveContainer" containerID="17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.537498 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} err="failed to get container status \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": rpc error: code = NotFound desc = could not find container \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": container with ID starting with 17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.537524 4771 scope.go:117] "RemoveContainer" containerID="a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.538110 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} err="failed to get container status \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": rpc error: code = NotFound desc = could not find container \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": container with ID starting with a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.538141 4771 scope.go:117] "RemoveContainer" containerID="ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.538534 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} err="failed to get container status 
\"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": rpc error: code = NotFound desc = could not find container \"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": container with ID starting with ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.538614 4771 scope.go:117] "RemoveContainer" containerID="81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.539125 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} err="failed to get container status \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": rpc error: code = NotFound desc = could not find container \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": container with ID starting with 81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.539187 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.539619 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} err="failed to get container status \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": rpc error: code = NotFound desc = could not find container \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": container with ID starting with 143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.539643 4771 scope.go:117] "RemoveContainer" containerID="b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.539880 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} err="failed to get container status \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": rpc error: code = NotFound desc = could not find container \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": container with ID starting with b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.539899 4771 scope.go:117] "RemoveContainer" containerID="735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.540182 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} err="failed to get container status \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": rpc error: code = NotFound desc = could not find container \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": container with ID starting with 735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.540210 4771 scope.go:117] "RemoveContainer" 
containerID="2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.540601 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} err="failed to get container status \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": rpc error: code = NotFound desc = could not find container \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": container with ID starting with 2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.540628 4771 scope.go:117] "RemoveContainer" containerID="ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.541155 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} err="failed to get container status \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": rpc error: code = NotFound desc = could not find container \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": container with ID starting with ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.541183 4771 scope.go:117] "RemoveContainer" containerID="965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.541476 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} err="failed to get container status \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": rpc error: code = NotFound desc = could not find container \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": container with ID starting with 965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.541507 4771 scope.go:117] "RemoveContainer" containerID="17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.542852 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} err="failed to get container status \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": rpc error: code = NotFound desc = could not find container \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": container with ID starting with 17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.542882 4771 scope.go:117] "RemoveContainer" containerID="a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.543252 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} err="failed to get container status \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": rpc error: code = NotFound desc = could not find 
container \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": container with ID starting with a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.543277 4771 scope.go:117] "RemoveContainer" containerID="ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.543703 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} err="failed to get container status \"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": rpc error: code = NotFound desc = could not find container \"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": container with ID starting with ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.543818 4771 scope.go:117] "RemoveContainer" containerID="81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.544196 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} err="failed to get container status \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": rpc error: code = NotFound desc = could not find container \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": container with ID starting with 81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.544220 4771 scope.go:117] "RemoveContainer" containerID="143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.544618 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b"} err="failed to get container status \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": rpc error: code = NotFound desc = could not find container \"143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b\": container with ID starting with 143278244a08bbcf723de4c4aa73906bfb87e914380d67d0f9a49d1e41da2e5b not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.544647 4771 scope.go:117] "RemoveContainer" containerID="b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.545126 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411"} err="failed to get container status \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": rpc error: code = NotFound desc = could not find container \"b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411\": container with ID starting with b671bfcc2b7bbd4cb9f3a5ce475f53269aeef3ee83f6a83e253b65f719d4c411 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.545150 4771 scope.go:117] "RemoveContainer" containerID="735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.545844 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1"} err="failed to get container status \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": rpc error: code = NotFound desc = could not find container \"735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1\": container with ID starting with 735525c44d9754609bbb40466bfb36da2c1542fb4b2b37bfaef7670dff7466f1 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.545868 4771 scope.go:117] "RemoveContainer" containerID="2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.546269 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90"} err="failed to get container status \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": rpc error: code = NotFound desc = could not find container \"2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90\": container with ID starting with 2c8cde5cf0dde1a036d2ddc26e6568a4f95cfc8372860e453492b796522fea90 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.546297 4771 scope.go:117] "RemoveContainer" containerID="ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.546725 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70"} err="failed to get container status \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": rpc error: code = NotFound desc = could not find container \"ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70\": container with ID starting with ae7486d73963dcfe5f2d5ea413838be6b3d69ad77b4161902650ea537b435d70 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.546749 4771 scope.go:117] "RemoveContainer" containerID="965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.547289 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08"} err="failed to get container status \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": rpc error: code = NotFound desc = could not find container \"965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08\": container with ID starting with 965e17eb3ad65380753129c38f077c1f2b491827db3eaeff4024be17cefe9a08 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.547315 4771 scope.go:117] "RemoveContainer" containerID="17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.547684 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c"} err="failed to get container status \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": rpc error: code = NotFound desc = could not find container \"17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c\": container with ID starting with 
17a0c697e13e03ba0d2a1928c838ae5d966059548f0697e4c26fdbd4a52daa9c not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.547715 4771 scope.go:117] "RemoveContainer" containerID="a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.548103 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62"} err="failed to get container status \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": rpc error: code = NotFound desc = could not find container \"a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62\": container with ID starting with a2cc4b32c4f6dea233f6e6f4fb178f64f7bccb1206a6e36f7f68ad9b6041ef62 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.548133 4771 scope.go:117] "RemoveContainer" containerID="ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.548586 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94"} err="failed to get container status \"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": rpc error: code = NotFound desc = could not find container \"ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94\": container with ID starting with ec23392cd09df830ef1b5c6e3a8ae80d6ee596df231d4c5ba97e7f6e329cda94 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.548724 4771 scope.go:117] "RemoveContainer" containerID="81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.549024 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66"} err="failed to get container status \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": rpc error: code = NotFound desc = could not find container \"81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66\": container with ID starting with 81bfc482bcc7951afc701926e8390bf7918dfa03a45e37013cda2c71c8950c66 not found: ID does not exist" Feb 27 01:17:45 crc kubenswrapper[4771]: I0227 01:17:45.788919 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f824c6-1bde-4e58-b4ef-72a56a140abb" path="/var/lib/kubelet/pods/21f824c6-1bde-4e58-b4ef-72a56a140abb/volumes" Feb 27 01:17:46 crc kubenswrapper[4771]: I0227 01:17:46.302747 4771 generic.go:334] "Generic (PLEG): container finished" podID="2295d052-efbd-4d2f-b1ba-20fc81f6da86" containerID="ce947a8726a9d27d802fbbc635c29983e4111820a14db6d4b0f0b822eb22d66c" exitCode=0 Feb 27 01:17:46 crc kubenswrapper[4771]: I0227 01:17:46.302831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerDied","Data":"ce947a8726a9d27d802fbbc635c29983e4111820a14db6d4b0f0b822eb22d66c"} Feb 27 01:17:46 crc kubenswrapper[4771]: I0227 01:17:46.303005 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"efe3cf652d3b23171d0f115644c3d051ac6522644cc2483f8c9420f7606faeb2"} 
Feb 27 01:17:47 crc kubenswrapper[4771]: I0227 01:17:47.315266 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"9a75b1e2f3ef5b6b1dcf958b9e142475a20ab2f8d89de546e009d463ad5ce019"} Feb 27 01:17:47 crc kubenswrapper[4771]: I0227 01:17:47.315671 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"fe24ea66e76789592ccc30a9ebd18a789b8b91d1e97e46431fa96709e10dd8bd"} Feb 27 01:17:47 crc kubenswrapper[4771]: I0227 01:17:47.315684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"3d732d154e85ed74cb6652e0165c2f6b62706e6155da5d47eac50319554dd099"} Feb 27 01:17:47 crc kubenswrapper[4771]: I0227 01:17:47.315694 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"2056e7fc467e7944fa8a73c6aa734ba5482d1cd3c68a5d32dbe43af2051c5c4e"} Feb 27 01:17:47 crc kubenswrapper[4771]: I0227 01:17:47.315706 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"0d6ac000d00dc597febf6d46d38df981f2c41dad45cf558c1f97b33251d75369"} Feb 27 01:17:47 crc kubenswrapper[4771]: I0227 01:17:47.315715 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"725993435ddf57783ee29e0d8685cca8aab6580db3007198847d7ac92a57b908"} Feb 27 01:17:50 crc kubenswrapper[4771]: I0227 01:17:50.350060 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"0bc5260b3c54d5b1831670b45d8d0e281517e6fd8ab6cf152daebac7eed05aad"} Feb 27 01:17:52 crc kubenswrapper[4771]: I0227 01:17:52.367762 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" event={"ID":"2295d052-efbd-4d2f-b1ba-20fc81f6da86","Type":"ContainerStarted","Data":"e63949286b0a967739e1ca60c618fab7188e498842006497a6defbb727762fe5"} Feb 27 01:17:52 crc kubenswrapper[4771]: I0227 01:17:52.368347 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:52 crc kubenswrapper[4771]: I0227 01:17:52.368363 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:52 crc kubenswrapper[4771]: I0227 01:17:52.368374 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:52 crc kubenswrapper[4771]: I0227 01:17:52.403005 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:52 crc kubenswrapper[4771]: I0227 01:17:52.410610 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:17:52 crc kubenswrapper[4771]: I0227 01:17:52.417761 4771 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" podStartSLOduration=7.417745377 podStartE2EDuration="7.417745377s" podCreationTimestamp="2026-02-27 01:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:17:52.416878344 +0000 UTC m=+785.354439632" watchObservedRunningTime="2026-02-27 01:17:52.417745377 +0000 UTC m=+785.355306675" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.148282 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535918-kd9mw"] Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.150909 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.154730 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.155042 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.155598 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.158474 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-kd9mw"] Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.187161 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wclj6\" (UniqueName: \"kubernetes.io/projected/f5cdfa47-132f-4eb8-95c0-efd8ba314ab7-kube-api-access-wclj6\") pod \"auto-csr-approver-29535918-kd9mw\" (UID: \"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7\") " pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.288496 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wclj6\" (UniqueName: \"kubernetes.io/projected/f5cdfa47-132f-4eb8-95c0-efd8ba314ab7-kube-api-access-wclj6\") pod \"auto-csr-approver-29535918-kd9mw\" (UID: \"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7\") " pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.324341 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wclj6\" (UniqueName: \"kubernetes.io/projected/f5cdfa47-132f-4eb8-95c0-efd8ba314ab7-kube-api-access-wclj6\") pod \"auto-csr-approver-29535918-kd9mw\" (UID: \"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7\") " pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.484319 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:00 crc kubenswrapper[4771]: E0227 01:18:00.536924 4771 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(231769d43ed72b7a1f03d738f8301f2ec123c10879e13cfde1308660821293a2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
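
In the "Observed pod startup duration" entry above for ovnkube-node-z2hps, podStartSLOduration excludes image-pull time while podStartE2EDuration includes it; the two are identical here (7.42s) because both pull timestamps are the zero value 0001-01-01, i.e. no image was pulled. A small sketch, under the same exported-journal assumption, that reports the per-pod pull overhead as the difference of the two figures:

    import re
    import sys

    # podStartSLOduration excludes image-pull time and podStartE2EDuration
    # does not, so e2e - slo approximates time spent pulling images.
    PAT = re.compile(
        r'Observed pod startup duration" pod="([^"]+)"'
        r'.*?podStartSLOduration=([0-9.]+)'
        r'.*?podStartE2EDuration="([0-9.]+)s"'
    )

    for line in open(sys.argv[1], encoding="utf-8", errors="replace"):
        m = PAT.search(line)
        if m:
            pod, slo, e2e = m.group(1), float(m.group(2)), float(m.group(3))
            print(f"{pod}: slo={slo:.2f}s e2e={e2e:.2f}s pull~{e2e - slo:.2f}s")

Against this section it would print ~0.00s of pull time for ovnkube-node-z2hps and ~1.06s for auto-csr-approver-29535918-kd9mw, matching that job's firstStartedPulling/lastFinishedPulling window later in the log.
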
Feb 27 01:18:00 crc kubenswrapper[4771]: E0227 01:18:00.537009 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(231769d43ed72b7a1f03d738f8301f2ec123c10879e13cfde1308660821293a2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:00 crc kubenswrapper[4771]: E0227 01:18:00.537045 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(231769d43ed72b7a1f03d738f8301f2ec123c10879e13cfde1308660821293a2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:00 crc kubenswrapper[4771]: E0227 01:18:00.537119 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535918-kd9mw_openshift-infra(f5cdfa47-132f-4eb8-95c0-efd8ba314ab7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535918-kd9mw_openshift-infra(f5cdfa47-132f-4eb8-95c0-efd8ba314ab7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(231769d43ed72b7a1f03d738f8301f2ec123c10879e13cfde1308660821293a2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" podUID="f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" Feb 27 01:18:00 crc kubenswrapper[4771]: I0227 01:18:00.773354 4771 scope.go:117] "RemoveContainer" containerID="1f5c442299aaf88392fdb9b66293dcde4d1eac2143b2828533d23ec4d8860a72" Feb 27 01:18:00 crc kubenswrapper[4771]: E0227 01:18:00.773821 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-srbwq_openshift-multus(3c460c23-4b4a-458f-a52e-4208b9942829)\"" pod="openshift-multus/multus-srbwq" podUID="3c460c23-4b4a-458f-a52e-4208b9942829" Feb 27 01:18:01 crc kubenswrapper[4771]: I0227 01:18:01.429127 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:01 crc kubenswrapper[4771]: I0227 01:18:01.429732 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:01 crc kubenswrapper[4771]: E0227 01:18:01.469123 4771 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(86d24350ad55a5be72c4fb9af3eb479772ff9ee19c3181b3cb1803b257b44a87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
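
The error entries at 01:18:00 and 01:18:01 are one failure surfaced once per kubelet layer (log.go, kuberuntime_sandbox.go, kuberuntime_manager.go, pod_workers.go): the sandbox for auto-csr-approver-29535918-kd9mw cannot be given a network because nothing has written a CNI conflist to /etc/kubernetes/cni/net.d/ yet, which is expected while kube-multus is itself in CrashLoopBackOff (back-off 20s) and ovnkube-node is still starting. Kubelet simply retries until the network provider recovers. A sketch (same exported-journal assumption as above) that tallies those retries per pod, counting only the kuberuntime_manager.go line so each attempt is counted once:

    import collections
    import re
    import sys

    # One CreatePodSandbox failure is logged by several kubelet layers;
    # counting the kuberuntime_manager.go message gives one hit per retry.
    PAT = re.compile(r'"CreatePodSandbox for pod failed".*?pod="([^"]+)"')

    fails = collections.Counter()
    for line in open(sys.argv[1], encoding="utf-8", errors="replace"):
        m = PAT.search(line)
        if m:
            fails[m.group(1)] += 1

    for pod, n in fails.most_common():
        print(f"{n:4d}  {pod}")

A pod whose count keeps growing after multus restarts (here it stops once the sandbox is finally created at 01:18:28) would point at a real CNI problem rather than ordinary startup ordering.
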
Feb 27 01:18:01 crc kubenswrapper[4771]: E0227 01:18:01.469189 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(86d24350ad55a5be72c4fb9af3eb479772ff9ee19c3181b3cb1803b257b44a87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:01 crc kubenswrapper[4771]: E0227 01:18:01.469214 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(86d24350ad55a5be72c4fb9af3eb479772ff9ee19c3181b3cb1803b257b44a87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:01 crc kubenswrapper[4771]: E0227 01:18:01.469263 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535918-kd9mw_openshift-infra(f5cdfa47-132f-4eb8-95c0-efd8ba314ab7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535918-kd9mw_openshift-infra(f5cdfa47-132f-4eb8-95c0-efd8ba314ab7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(86d24350ad55a5be72c4fb9af3eb479772ff9ee19c3181b3cb1803b257b44a87): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" podUID="f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" Feb 27 01:18:07 crc kubenswrapper[4771]: I0227 01:18:07.302912 4771 scope.go:117] "RemoveContainer" containerID="60f3277e71f994b220e974e345c987b63441737b0cbfeb43596e96b208c99291" Feb 27 01:18:07 crc kubenswrapper[4771]: I0227 01:18:07.358617 4771 scope.go:117] "RemoveContainer" containerID="2c7c6ab7e2f9f9a791788e08cb062278f1f02827724ac88cb37e84e901b8dcda" Feb 27 01:18:07 crc kubenswrapper[4771]: I0227 01:18:07.474407 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/2.log" Feb 27 01:18:13 crc kubenswrapper[4771]: I0227 01:18:13.773264 4771 scope.go:117] "RemoveContainer" containerID="1f5c442299aaf88392fdb9b66293dcde4d1eac2143b2828533d23ec4d8860a72" Feb 27 01:18:14 crc kubenswrapper[4771]: I0227 01:18:14.525433 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-srbwq_3c460c23-4b4a-458f-a52e-4208b9942829/kube-multus/2.log" Feb 27 01:18:14 crc kubenswrapper[4771]: I0227 01:18:14.525840 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-srbwq" event={"ID":"3c460c23-4b4a-458f-a52e-4208b9942829","Type":"ContainerStarted","Data":"101b11679da99cb4aa6bca8306e132a0aac17f9eb18a0d7419e6c75223cbc3ba"} Feb 27 01:18:14 crc kubenswrapper[4771]: I0227 01:18:14.772323 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:14 crc kubenswrapper[4771]: I0227 01:18:14.772692 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:14 crc kubenswrapper[4771]: E0227 01:18:14.804665 4771 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(94f7f4a8b826b5869206fc96094a46066e731460982921e2e147b6cfdbbcd661): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 01:18:14 crc kubenswrapper[4771]: E0227 01:18:14.804757 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(94f7f4a8b826b5869206fc96094a46066e731460982921e2e147b6cfdbbcd661): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:14 crc kubenswrapper[4771]: E0227 01:18:14.804792 4771 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(94f7f4a8b826b5869206fc96094a46066e731460982921e2e147b6cfdbbcd661): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:14 crc kubenswrapper[4771]: E0227 01:18:14.804864 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535918-kd9mw_openshift-infra(f5cdfa47-132f-4eb8-95c0-efd8ba314ab7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535918-kd9mw_openshift-infra(f5cdfa47-132f-4eb8-95c0-efd8ba314ab7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535918-kd9mw_openshift-infra_f5cdfa47-132f-4eb8-95c0-efd8ba314ab7_0(94f7f4a8b826b5869206fc96094a46066e731460982921e2e147b6cfdbbcd661): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" podUID="f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" Feb 27 01:18:15 crc kubenswrapper[4771]: I0227 01:18:15.445687 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z2hps" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.115218 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck"] Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.117698 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.121075 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.132076 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck"] Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.277643 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hvls\" (UniqueName: \"kubernetes.io/projected/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-kube-api-access-9hvls\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.277771 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.278349 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.379875 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.379957 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hvls\" (UniqueName: \"kubernetes.io/projected/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-kube-api-access-9hvls\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.380145 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.380907 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.381294 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.418190 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hvls\" (UniqueName: \"kubernetes.io/projected/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-kube-api-access-9hvls\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.440504 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:20 crc kubenswrapper[4771]: I0227 01:18:20.686978 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck"] Feb 27 01:18:20 crc kubenswrapper[4771]: W0227 01:18:20.690934 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ae3d79_21d2_41f2_8685_9eeb9095dbb9.slice/crio-4b5edfa9f5fc2bae8a59e3ce57d8403c433927c15daa3e21f7b6e48e06425661 WatchSource:0}: Error finding container 4b5edfa9f5fc2bae8a59e3ce57d8403c433927c15daa3e21f7b6e48e06425661: Status 404 returned error can't find the container with id 4b5edfa9f5fc2bae8a59e3ce57d8403c433927c15daa3e21f7b6e48e06425661 Feb 27 01:18:21 crc kubenswrapper[4771]: I0227 01:18:21.579413 4771 generic.go:334] "Generic (PLEG): container finished" podID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerID="b51f79bff2f2b91298e1ce4c78e9503205a94e9051535ce19b6ff1bba1ab5da4" exitCode=0 Feb 27 01:18:21 crc kubenswrapper[4771]: I0227 01:18:21.579477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" event={"ID":"78ae3d79-21d2-41f2-8685-9eeb9095dbb9","Type":"ContainerDied","Data":"b51f79bff2f2b91298e1ce4c78e9503205a94e9051535ce19b6ff1bba1ab5da4"} Feb 27 01:18:21 crc kubenswrapper[4771]: I0227 01:18:21.579519 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" event={"ID":"78ae3d79-21d2-41f2-8685-9eeb9095dbb9","Type":"ContainerStarted","Data":"4b5edfa9f5fc2bae8a59e3ce57d8403c433927c15daa3e21f7b6e48e06425661"} Feb 27 01:18:23 crc kubenswrapper[4771]: I0227 01:18:23.602751 4771 generic.go:334] "Generic (PLEG): container finished" podID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerID="afc14a709267a33967442691a440db16b5979adf107c0edbfd690f1082b15eb7" exitCode=0 Feb 27 01:18:23 crc kubenswrapper[4771]: I0227 01:18:23.602890 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" event={"ID":"78ae3d79-21d2-41f2-8685-9eeb9095dbb9","Type":"ContainerDied","Data":"afc14a709267a33967442691a440db16b5979adf107c0edbfd690f1082b15eb7"} Feb 27 01:18:24 crc kubenswrapper[4771]: I0227 01:18:24.613792 4771 generic.go:334] "Generic (PLEG): container finished" podID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerID="13882e8cf917eebeafde741a9de8901369cef203db25e722b425964d343ec110" exitCode=0 Feb 27 01:18:24 crc kubenswrapper[4771]: I0227 01:18:24.613928 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" event={"ID":"78ae3d79-21d2-41f2-8685-9eeb9095dbb9","Type":"ContainerDied","Data":"13882e8cf917eebeafde741a9de8901369cef203db25e722b425964d343ec110"} Feb 27 01:18:25 crc kubenswrapper[4771]: I0227 01:18:25.864159 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:25 crc kubenswrapper[4771]: I0227 01:18:25.966185 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-bundle\") pod \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " Feb 27 01:18:25 crc kubenswrapper[4771]: I0227 01:18:25.966462 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hvls\" (UniqueName: \"kubernetes.io/projected/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-kube-api-access-9hvls\") pod \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " Feb 27 01:18:25 crc kubenswrapper[4771]: I0227 01:18:25.966631 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-util\") pod \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\" (UID: \"78ae3d79-21d2-41f2-8685-9eeb9095dbb9\") " Feb 27 01:18:25 crc kubenswrapper[4771]: I0227 01:18:25.967409 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-bundle" (OuterVolumeSpecName: "bundle") pod "78ae3d79-21d2-41f2-8685-9eeb9095dbb9" (UID: "78ae3d79-21d2-41f2-8685-9eeb9095dbb9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:18:25 crc kubenswrapper[4771]: I0227 01:18:25.973150 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-kube-api-access-9hvls" (OuterVolumeSpecName: "kube-api-access-9hvls") pod "78ae3d79-21d2-41f2-8685-9eeb9095dbb9" (UID: "78ae3d79-21d2-41f2-8685-9eeb9095dbb9"). InnerVolumeSpecName "kube-api-access-9hvls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:18:25 crc kubenswrapper[4771]: I0227 01:18:25.987452 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-util" (OuterVolumeSpecName: "util") pod "78ae3d79-21d2-41f2-8685-9eeb9095dbb9" (UID: "78ae3d79-21d2-41f2-8685-9eeb9095dbb9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:18:26 crc kubenswrapper[4771]: I0227 01:18:26.068153 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:18:26 crc kubenswrapper[4771]: I0227 01:18:26.068202 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hvls\" (UniqueName: \"kubernetes.io/projected/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-kube-api-access-9hvls\") on node \"crc\" DevicePath \"\"" Feb 27 01:18:26 crc kubenswrapper[4771]: I0227 01:18:26.068222 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78ae3d79-21d2-41f2-8685-9eeb9095dbb9-util\") on node \"crc\" DevicePath \"\"" Feb 27 01:18:26 crc kubenswrapper[4771]: I0227 01:18:26.631643 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" event={"ID":"78ae3d79-21d2-41f2-8685-9eeb9095dbb9","Type":"ContainerDied","Data":"4b5edfa9f5fc2bae8a59e3ce57d8403c433927c15daa3e21f7b6e48e06425661"} Feb 27 01:18:26 crc kubenswrapper[4771]: I0227 01:18:26.632071 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b5edfa9f5fc2bae8a59e3ce57d8403c433927c15daa3e21f7b6e48e06425661" Feb 27 01:18:26 crc kubenswrapper[4771]: I0227 01:18:26.631668 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck" Feb 27 01:18:27 crc kubenswrapper[4771]: I0227 01:18:27.772863 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:27 crc kubenswrapper[4771]: I0227 01:18:27.782646 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:28 crc kubenswrapper[4771]: I0227 01:18:28.061311 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-kd9mw"] Feb 27 01:18:28 crc kubenswrapper[4771]: W0227 01:18:28.068431 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5cdfa47_132f_4eb8_95c0_efd8ba314ab7.slice/crio-6e4687318591e77c2de5d5d5d8bd53bba5cdb18819c0084a7fcac9b374578554 WatchSource:0}: Error finding container 6e4687318591e77c2de5d5d5d8bd53bba5cdb18819c0084a7fcac9b374578554: Status 404 returned error can't find the container with id 6e4687318591e77c2de5d5d5d8bd53bba5cdb18819c0084a7fcac9b374578554 Feb 27 01:18:28 crc kubenswrapper[4771]: I0227 01:18:28.649987 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" event={"ID":"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7","Type":"ContainerStarted","Data":"6e4687318591e77c2de5d5d5d8bd53bba5cdb18819c0084a7fcac9b374578554"} Feb 27 01:18:29 crc kubenswrapper[4771]: I0227 01:18:29.659773 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" event={"ID":"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7","Type":"ContainerStarted","Data":"b2398ecd90e0f24eb235ae146443c005e5a83958bb522228dd6c94a74044a3ba"} Feb 27 01:18:29 crc kubenswrapper[4771]: I0227 01:18:29.683622 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" podStartSLOduration=28.621089338 podStartE2EDuration="29.683597748s" podCreationTimestamp="2026-02-27 01:18:00 +0000 UTC" firstStartedPulling="2026-02-27 01:18:28.072830372 +0000 UTC m=+821.010391700" lastFinishedPulling="2026-02-27 01:18:29.135338782 +0000 UTC m=+822.072900110" observedRunningTime="2026-02-27 01:18:29.676770291 +0000 UTC m=+822.614331619" watchObservedRunningTime="2026-02-27 01:18:29.683597748 +0000 UTC m=+822.621159066" Feb 27 01:18:30 crc kubenswrapper[4771]: I0227 01:18:30.667712 4771 generic.go:334] "Generic (PLEG): container finished" podID="f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" containerID="b2398ecd90e0f24eb235ae146443c005e5a83958bb522228dd6c94a74044a3ba" exitCode=0 Feb 27 01:18:30 crc kubenswrapper[4771]: I0227 01:18:30.667778 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" event={"ID":"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7","Type":"ContainerDied","Data":"b2398ecd90e0f24eb235ae146443c005e5a83958bb522228dd6c94a74044a3ba"} Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.704717 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t"] Feb 27 01:18:31 crc kubenswrapper[4771]: E0227 01:18:31.704908 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerName="util" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.704919 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerName="util" Feb 27 01:18:31 crc kubenswrapper[4771]: E0227 01:18:31.704930 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerName="pull" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.704936 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerName="pull" Feb 27 01:18:31 crc kubenswrapper[4771]: E0227 01:18:31.704946 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerName="extract" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.704952 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerName="extract" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.705032 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ae3d79-21d2-41f2-8685-9eeb9095dbb9" containerName="extract" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.705371 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.707884 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.708027 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-wrpdz" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.711163 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.727299 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t"] Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.846786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bls7\" (UniqueName: \"kubernetes.io/projected/b7560148-b519-4709-a6a8-184258052e14-kube-api-access-8bls7\") pod \"nmstate-operator-75c5dccd6c-tmk5t\" (UID: \"b7560148-b519-4709-a6a8-184258052e14\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.888337 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.948321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bls7\" (UniqueName: \"kubernetes.io/projected/b7560148-b519-4709-a6a8-184258052e14-kube-api-access-8bls7\") pod \"nmstate-operator-75c5dccd6c-tmk5t\" (UID: \"b7560148-b519-4709-a6a8-184258052e14\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t" Feb 27 01:18:31 crc kubenswrapper[4771]: I0227 01:18:31.966451 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bls7\" (UniqueName: \"kubernetes.io/projected/b7560148-b519-4709-a6a8-184258052e14-kube-api-access-8bls7\") pod \"nmstate-operator-75c5dccd6c-tmk5t\" (UID: \"b7560148-b519-4709-a6a8-184258052e14\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t" Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.025209 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t" Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.049781 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wclj6\" (UniqueName: \"kubernetes.io/projected/f5cdfa47-132f-4eb8-95c0-efd8ba314ab7-kube-api-access-wclj6\") pod \"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7\" (UID: \"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7\") " Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.054541 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5cdfa47-132f-4eb8-95c0-efd8ba314ab7-kube-api-access-wclj6" (OuterVolumeSpecName: "kube-api-access-wclj6") pod "f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" (UID: "f5cdfa47-132f-4eb8-95c0-efd8ba314ab7"). InnerVolumeSpecName "kube-api-access-wclj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.152270 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wclj6\" (UniqueName: \"kubernetes.io/projected/f5cdfa47-132f-4eb8-95c0-efd8ba314ab7-kube-api-access-wclj6\") on node \"crc\" DevicePath \"\"" Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.265771 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t"] Feb 27 01:18:32 crc kubenswrapper[4771]: W0227 01:18:32.271825 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7560148_b519_4709_a6a8_184258052e14.slice/crio-63d2b2521382467ae8d8452299fcdabbac6aae89eb81b88b8fcc40396200b205 WatchSource:0}: Error finding container 63d2b2521382467ae8d8452299fcdabbac6aae89eb81b88b8fcc40396200b205: Status 404 returned error can't find the container with id 63d2b2521382467ae8d8452299fcdabbac6aae89eb81b88b8fcc40396200b205 Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.679973 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" event={"ID":"f5cdfa47-132f-4eb8-95c0-efd8ba314ab7","Type":"ContainerDied","Data":"6e4687318591e77c2de5d5d5d8bd53bba5cdb18819c0084a7fcac9b374578554"} Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.680292 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e4687318591e77c2de5d5d5d8bd53bba5cdb18819c0084a7fcac9b374578554" Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.680015 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535918-kd9mw" Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.681276 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t" event={"ID":"b7560148-b519-4709-a6a8-184258052e14","Type":"ContainerStarted","Data":"63d2b2521382467ae8d8452299fcdabbac6aae89eb81b88b8fcc40396200b205"} Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.735680 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-6mr69"] Feb 27 01:18:32 crc kubenswrapper[4771]: I0227 01:18:32.741721 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535912-6mr69"] Feb 27 01:18:33 crc kubenswrapper[4771]: I0227 01:18:33.782036 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc662da1-e2be-4b52-ae55-a223b4ffb8ad" path="/var/lib/kubelet/pods/bc662da1-e2be-4b52-ae55-a223b4ffb8ad/volumes" Feb 27 01:18:34 crc kubenswrapper[4771]: I0227 01:18:34.694130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t" event={"ID":"b7560148-b519-4709-a6a8-184258052e14","Type":"ContainerStarted","Data":"98b87ad2502b00fadabe1cae4c6123a9354b5ab6b5ae8a672991df0523ce9697"} Feb 27 01:18:35 crc kubenswrapper[4771]: I0227 01:18:35.337402 4771 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.133469 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-tmk5t" podStartSLOduration=6.975015487 podStartE2EDuration="9.133449173s" podCreationTimestamp="2026-02-27 01:18:31 +0000 UTC" firstStartedPulling="2026-02-27 01:18:32.274082478 +0000 UTC m=+825.211643776" lastFinishedPulling="2026-02-27 01:18:34.432516134 +0000 UTC m=+827.370077462" observedRunningTime="2026-02-27 01:18:34.714409064 +0000 UTC m=+827.651970352" watchObservedRunningTime="2026-02-27 01:18:40.133449173 +0000 UTC m=+833.071010471" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.134411 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-bkc9p"] Feb 27 01:18:40 crc kubenswrapper[4771]: E0227 01:18:40.134658 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" containerName="oc" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.134673 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" containerName="oc" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.134815 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" containerName="oc" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.135607 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.137799 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vv66m" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.140149 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d"] Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.141001 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.143237 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-ln8xd"] Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.143756 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.145051 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.162563 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d"] Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.178364 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-bkc9p"] Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.264147 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqxwd\" (UniqueName: \"kubernetes.io/projected/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-kube-api-access-cqxwd\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.264195 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/397a2bf0-511c-4cc9-964c-e1d2efc662ea-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-pfb4d\" (UID: \"397a2bf0-511c-4cc9-964c-e1d2efc662ea\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.264226 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-nmstate-lock\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.264247 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vww2\" (UniqueName: \"kubernetes.io/projected/6049b388-cb33-408a-848e-90a3e9767488-kube-api-access-7vww2\") pod \"nmstate-metrics-69594cc75-bkc9p\" (UID: \"6049b388-cb33-408a-848e-90a3e9767488\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.264375 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-dbus-socket\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " 
pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.264480 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-ovs-socket\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.264570 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-458m8\" (UniqueName: \"kubernetes.io/projected/397a2bf0-511c-4cc9-964c-e1d2efc662ea-kube-api-access-458m8\") pod \"nmstate-webhook-786f45cff4-pfb4d\" (UID: \"397a2bf0-511c-4cc9-964c-e1d2efc662ea\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.278817 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"] Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.279391 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.281482 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.281695 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-wvbrw" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.281736 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.318659 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"] Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqxwd\" (UniqueName: \"kubernetes.io/projected/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-kube-api-access-cqxwd\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/397a2bf0-511c-4cc9-964c-e1d2efc662ea-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-pfb4d\" (UID: \"397a2bf0-511c-4cc9-964c-e1d2efc662ea\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366149 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-nmstate-lock\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vww2\" (UniqueName: \"kubernetes.io/projected/6049b388-cb33-408a-848e-90a3e9767488-kube-api-access-7vww2\") pod \"nmstate-metrics-69594cc75-bkc9p\" (UID: \"6049b388-cb33-408a-848e-90a3e9767488\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p" Feb 
27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-dbus-socket\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-ovs-socket\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366248 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-458m8\" (UniqueName: \"kubernetes.io/projected/397a2bf0-511c-4cc9-964c-e1d2efc662ea-kube-api-access-458m8\") pod \"nmstate-webhook-786f45cff4-pfb4d\" (UID: \"397a2bf0-511c-4cc9-964c-e1d2efc662ea\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366284 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-nmstate-lock\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: E0227 01:18:40.366403 4771 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366507 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-ovs-socket\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:40 crc kubenswrapper[4771]: E0227 01:18:40.366568 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/397a2bf0-511c-4cc9-964c-e1d2efc662ea-tls-key-pair podName:397a2bf0-511c-4cc9-964c-e1d2efc662ea nodeName:}" failed. No retries permitted until 2026-02-27 01:18:40.866506097 +0000 UTC m=+833.804067385 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/397a2bf0-511c-4cc9-964c-e1d2efc662ea-tls-key-pair") pod "nmstate-webhook-786f45cff4-pfb4d" (UID: "397a2bf0-511c-4cc9-964c-e1d2efc662ea") : secret "openshift-nmstate-webhook" not found
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.366566 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-dbus-socket\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.385923 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vww2\" (UniqueName: \"kubernetes.io/projected/6049b388-cb33-408a-848e-90a3e9767488-kube-api-access-7vww2\") pod \"nmstate-metrics-69594cc75-bkc9p\" (UID: \"6049b388-cb33-408a-848e-90a3e9767488\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.386478 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqxwd\" (UniqueName: \"kubernetes.io/projected/7b2bdabc-b325-4bc2-91f8-39e9f12ec946-kube-api-access-cqxwd\") pod \"nmstate-handler-ln8xd\" (UID: \"7b2bdabc-b325-4bc2-91f8-39e9f12ec946\") " pod="openshift-nmstate/nmstate-handler-ln8xd"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.387272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-458m8\" (UniqueName: \"kubernetes.io/projected/397a2bf0-511c-4cc9-964c-e1d2efc662ea-kube-api-access-458m8\") pod \"nmstate-webhook-786f45cff4-pfb4d\" (UID: \"397a2bf0-511c-4cc9-964c-e1d2efc662ea\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.453511 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.466999 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctcd4\" (UniqueName: \"kubernetes.io/projected/7c2f136b-c273-45f2-bbd2-923046cf0861-kube-api-access-ctcd4\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.467081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c2f136b-c273-45f2-bbd2-923046cf0861-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.467101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7c2f136b-c273-45f2-bbd2-923046cf0861-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.475562 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ln8xd"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.481749 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-785db76b97-tqrt5"]
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.482377 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.501966 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-785db76b97-tqrt5"]
Feb 27 01:18:40 crc kubenswrapper[4771]: W0227 01:18:40.507727 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b2bdabc_b325_4bc2_91f8_39e9f12ec946.slice/crio-4a47da8cc2ee8456b917df02afe5454278d24060cc69fa3de16fa4278ad3e99b WatchSource:0}: Error finding container 4a47da8cc2ee8456b917df02afe5454278d24060cc69fa3de16fa4278ad3e99b: Status 404 returned error can't find the container with id 4a47da8cc2ee8456b917df02afe5454278d24060cc69fa3de16fa4278ad3e99b
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.568592 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctcd4\" (UniqueName: \"kubernetes.io/projected/7c2f136b-c273-45f2-bbd2-923046cf0861-kube-api-access-ctcd4\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.568712 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c2f136b-c273-45f2-bbd2-923046cf0861-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.568738 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7c2f136b-c273-45f2-bbd2-923046cf0861-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.570005 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7c2f136b-c273-45f2-bbd2-923046cf0861-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.573200 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c2f136b-c273-45f2-bbd2-923046cf0861-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.585135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctcd4\" (UniqueName: \"kubernetes.io/projected/7c2f136b-c273-45f2-bbd2-923046cf0861-kube-api-access-ctcd4\") pod \"nmstate-console-plugin-5dcbbd79cf-thhng\" (UID: \"7c2f136b-c273-45f2-bbd2-923046cf0861\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.596018 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.670850 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtnq9\" (UniqueName: \"kubernetes.io/projected/0146552b-1489-4b8b-8637-1b0be99091a4-kube-api-access-jtnq9\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.670939 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0146552b-1489-4b8b-8637-1b0be99091a4-console-oauth-config\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.671001 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0146552b-1489-4b8b-8637-1b0be99091a4-console-serving-cert\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.671025 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-trusted-ca-bundle\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.671050 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-oauth-serving-cert\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.671087 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-console-config\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.671114 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-service-ca\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.681283 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-bkc9p"]
Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.738045 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p" event={"ID":"6049b388-cb33-408a-848e-90a3e9767488","Type":"ContainerStarted","Data":"495c8e6767283ec544e83e86016da76afec80715878f2fd677bbdfa11ba10a94"}
event={"ID":"6049b388-cb33-408a-848e-90a3e9767488","Type":"ContainerStarted","Data":"495c8e6767283ec544e83e86016da76afec80715878f2fd677bbdfa11ba10a94"} Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.738828 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ln8xd" event={"ID":"7b2bdabc-b325-4bc2-91f8-39e9f12ec946","Type":"ContainerStarted","Data":"4a47da8cc2ee8456b917df02afe5454278d24060cc69fa3de16fa4278ad3e99b"} Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.771594 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0146552b-1489-4b8b-8637-1b0be99091a4-console-serving-cert\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.771636 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-trusted-ca-bundle\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.771658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-oauth-serving-cert\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.771692 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-console-config\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.771715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-service-ca\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.771759 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtnq9\" (UniqueName: \"kubernetes.io/projected/0146552b-1489-4b8b-8637-1b0be99091a4-kube-api-access-jtnq9\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.771795 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0146552b-1489-4b8b-8637-1b0be99091a4-console-oauth-config\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.772847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-trusted-ca-bundle\") pod \"console-785db76b97-tqrt5\" (UID: 
\"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.772996 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-service-ca\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.773102 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-oauth-serving-cert\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.775037 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0146552b-1489-4b8b-8637-1b0be99091a4-console-config\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.776519 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0146552b-1489-4b8b-8637-1b0be99091a4-console-oauth-config\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.777046 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0146552b-1489-4b8b-8637-1b0be99091a4-console-serving-cert\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.787809 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtnq9\" (UniqueName: \"kubernetes.io/projected/0146552b-1489-4b8b-8637-1b0be99091a4-kube-api-access-jtnq9\") pod \"console-785db76b97-tqrt5\" (UID: \"0146552b-1489-4b8b-8637-1b0be99091a4\") " pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.825316 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng"] Feb 27 01:18:40 crc kubenswrapper[4771]: W0227 01:18:40.827953 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c2f136b_c273_45f2_bbd2_923046cf0861.slice/crio-5cc1ea6b448f3550c7709a825b700991f56161f5ec011e69d82d2810d5f3e308 WatchSource:0}: Error finding container 5cc1ea6b448f3550c7709a825b700991f56161f5ec011e69d82d2810d5f3e308: Status 404 returned error can't find the container with id 5cc1ea6b448f3550c7709a825b700991f56161f5ec011e69d82d2810d5f3e308 Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.848676 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-785db76b97-tqrt5" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.872331 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/397a2bf0-511c-4cc9-964c-e1d2efc662ea-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-pfb4d\" (UID: \"397a2bf0-511c-4cc9-964c-e1d2efc662ea\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:40 crc kubenswrapper[4771]: I0227 01:18:40.875999 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/397a2bf0-511c-4cc9-964c-e1d2efc662ea-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-pfb4d\" (UID: \"397a2bf0-511c-4cc9-964c-e1d2efc662ea\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:41 crc kubenswrapper[4771]: I0227 01:18:41.030352 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-785db76b97-tqrt5"] Feb 27 01:18:41 crc kubenswrapper[4771]: I0227 01:18:41.067986 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:41 crc kubenswrapper[4771]: I0227 01:18:41.276310 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d"] Feb 27 01:18:41 crc kubenswrapper[4771]: W0227 01:18:41.291427 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397a2bf0_511c_4cc9_964c_e1d2efc662ea.slice/crio-7c2834d7ac6acc123154d57f2c6955a63f8671b5c3ade7e7f20f3a966e71d259 WatchSource:0}: Error finding container 7c2834d7ac6acc123154d57f2c6955a63f8671b5c3ade7e7f20f3a966e71d259: Status 404 returned error can't find the container with id 7c2834d7ac6acc123154d57f2c6955a63f8671b5c3ade7e7f20f3a966e71d259 Feb 27 01:18:41 crc kubenswrapper[4771]: I0227 01:18:41.750420 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" event={"ID":"397a2bf0-511c-4cc9-964c-e1d2efc662ea","Type":"ContainerStarted","Data":"7c2834d7ac6acc123154d57f2c6955a63f8671b5c3ade7e7f20f3a966e71d259"} Feb 27 01:18:41 crc kubenswrapper[4771]: I0227 01:18:41.752517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng" event={"ID":"7c2f136b-c273-45f2-bbd2-923046cf0861","Type":"ContainerStarted","Data":"5cc1ea6b448f3550c7709a825b700991f56161f5ec011e69d82d2810d5f3e308"} Feb 27 01:18:41 crc kubenswrapper[4771]: I0227 01:18:41.755078 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785db76b97-tqrt5" event={"ID":"0146552b-1489-4b8b-8637-1b0be99091a4","Type":"ContainerStarted","Data":"f78db7760a73289587efd0fc4a1b6ecb20892422de054e7f5754dd2c4f32ea08"} Feb 27 01:18:41 crc kubenswrapper[4771]: I0227 01:18:41.755123 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785db76b97-tqrt5" event={"ID":"0146552b-1489-4b8b-8637-1b0be99091a4","Type":"ContainerStarted","Data":"7b745bbf5c2a9a1cec4bf0da979cf6cbc19293889850f778bde4b6a0982f5263"} Feb 27 01:18:41 crc kubenswrapper[4771]: I0227 01:18:41.793509 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-785db76b97-tqrt5" podStartSLOduration=1.793485309 podStartE2EDuration="1.793485309s" podCreationTimestamp="2026-02-27 01:18:40 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:18:41.780428872 +0000 UTC m=+834.717990260" watchObservedRunningTime="2026-02-27 01:18:41.793485309 +0000 UTC m=+834.731046637" Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.777630 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ln8xd" event={"ID":"7b2bdabc-b325-4bc2-91f8-39e9f12ec946","Type":"ContainerStarted","Data":"170522934ba1c08f8519ce5bfdf188c0afedef3bb72664018be43bdb801997ca"} Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.778385 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-ln8xd" Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.780319 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p" event={"ID":"6049b388-cb33-408a-848e-90a3e9767488","Type":"ContainerStarted","Data":"3291ff9ca2ae50f8e3c42bd2c75e6b5b59a6595127bea554ab63e60a90a655af"} Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.789276 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" event={"ID":"397a2bf0-511c-4cc9-964c-e1d2efc662ea","Type":"ContainerStarted","Data":"15b6553280e7a05e225161f3918799661b421c996dbd51293270f58a174477ee"} Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.789690 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.791377 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng" event={"ID":"7c2f136b-c273-45f2-bbd2-923046cf0861","Type":"ContainerStarted","Data":"94183a106bb7a500b7b6406a0db11de2aeef6b7613d0d68ff3d1e27f311628d6"} Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.810414 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-ln8xd" podStartSLOduration=1.671035716 podStartE2EDuration="4.810384148s" podCreationTimestamp="2026-02-27 01:18:40 +0000 UTC" firstStartedPulling="2026-02-27 01:18:40.509850362 +0000 UTC m=+833.447411650" lastFinishedPulling="2026-02-27 01:18:43.649198764 +0000 UTC m=+836.586760082" observedRunningTime="2026-02-27 01:18:44.800402575 +0000 UTC m=+837.737963913" watchObservedRunningTime="2026-02-27 01:18:44.810384148 +0000 UTC m=+837.747945486" Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.846277 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-thhng" podStartSLOduration=2.025536446 podStartE2EDuration="4.84625871s" podCreationTimestamp="2026-02-27 01:18:40 +0000 UTC" firstStartedPulling="2026-02-27 01:18:40.829687013 +0000 UTC m=+833.767248321" lastFinishedPulling="2026-02-27 01:18:43.650409267 +0000 UTC m=+836.587970585" observedRunningTime="2026-02-27 01:18:44.820177186 +0000 UTC m=+837.757738524" watchObservedRunningTime="2026-02-27 01:18:44.84625871 +0000 UTC m=+837.783820008" Feb 27 01:18:44 crc kubenswrapper[4771]: I0227 01:18:44.849180 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d" podStartSLOduration=2.490514929 podStartE2EDuration="4.849168369s" podCreationTimestamp="2026-02-27 01:18:40 +0000 UTC" firstStartedPulling="2026-02-27 
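NOTE: The pod_startup_latency_tracker entries follow a simple identity: podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling), i.e. the SLO figure is end-to-end startup minus image-pull time. For nmstate-handler-ln8xd above, using the monotonic m=+ offsets: 4.810384148 - (836.586760082 - 833.447411650) = 4.810384148 - 3.139348432 = 1.671035716, exactly the reported podStartSLOduration. For console-785db76b97-tqrt5, whose pull timestamps are the zero value (nothing was pulled), the two durations coincide at 1.793485309s.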
Feb 27 01:18:46 crc kubenswrapper[4771]: I0227 01:18:46.805611 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p" event={"ID":"6049b388-cb33-408a-848e-90a3e9767488","Type":"ContainerStarted","Data":"d1a8618e151a24127ae4e3e1e2495a2711a5d2016e1a8791d4cbc604f6cea0ee"}
Feb 27 01:18:50 crc kubenswrapper[4771]: I0227 01:18:50.514530 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-ln8xd"
Feb 27 01:18:50 crc kubenswrapper[4771]: I0227 01:18:50.536220 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-bkc9p" podStartSLOduration=4.970702108 podStartE2EDuration="10.536194109s" podCreationTimestamp="2026-02-27 01:18:40 +0000 UTC" firstStartedPulling="2026-02-27 01:18:40.69198531 +0000 UTC m=+833.629546598" lastFinishedPulling="2026-02-27 01:18:46.257477311 +0000 UTC m=+839.195038599" observedRunningTime="2026-02-27 01:18:46.83503288 +0000 UTC m=+839.772594198" watchObservedRunningTime="2026-02-27 01:18:50.536194109 +0000 UTC m=+843.473755437"
Feb 27 01:18:50 crc kubenswrapper[4771]: I0227 01:18:50.849026 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:50 crc kubenswrapper[4771]: I0227 01:18:50.849268 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:50 crc kubenswrapper[4771]: I0227 01:18:50.857612 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:51 crc kubenswrapper[4771]: I0227 01:18:51.847410 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-785db76b97-tqrt5"
Feb 27 01:18:51 crc kubenswrapper[4771]: I0227 01:18:51.923297 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-65dsm"]
Feb 27 01:18:54 crc kubenswrapper[4771]: I0227 01:18:54.890881 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2wn72"]
Feb 27 01:18:54 crc kubenswrapper[4771]: I0227 01:18:54.893316 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:54 crc kubenswrapper[4771]: I0227 01:18:54.903714 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2wn72"]
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.009079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-utilities\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.009522 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-catalog-content\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.009728 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgv4c\" (UniqueName: \"kubernetes.io/projected/54a1b4f0-c80a-4e94-ba7e-b5db00375495-kube-api-access-lgv4c\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.111018 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-catalog-content\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.111616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgv4c\" (UniqueName: \"kubernetes.io/projected/54a1b4f0-c80a-4e94-ba7e-b5db00375495-kube-api-access-lgv4c\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.111536 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-catalog-content\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.112102 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-utilities\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.112586 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-utilities\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.143811 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgv4c\" (UniqueName: \"kubernetes.io/projected/54a1b4f0-c80a-4e94-ba7e-b5db00375495-kube-api-access-lgv4c\") pod \"certified-operators-2wn72\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") " pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.232146 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:18:55 crc kubenswrapper[4771]: I0227 01:18:55.718408 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2wn72"]
Feb 27 01:18:55 crc kubenswrapper[4771]: W0227 01:18:55.722006 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a1b4f0_c80a_4e94_ba7e_b5db00375495.slice/crio-67ee4c386d1f62e0fdfc70d97586a7705fd9d2848e11065627db56d772127bf8 WatchSource:0}: Error finding container 67ee4c386d1f62e0fdfc70d97586a7705fd9d2848e11065627db56d772127bf8: Status 404 returned error can't find the container with id 67ee4c386d1f62e0fdfc70d97586a7705fd9d2848e11065627db56d772127bf8
Feb 27 01:18:56 crc kubenswrapper[4771]: I0227 01:18:56.125412 4771 generic.go:334] "Generic (PLEG): container finished" podID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerID="f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef" exitCode=0
Feb 27 01:18:56 crc kubenswrapper[4771]: I0227 01:18:56.125488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wn72" event={"ID":"54a1b4f0-c80a-4e94-ba7e-b5db00375495","Type":"ContainerDied","Data":"f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef"}
Feb 27 01:18:56 crc kubenswrapper[4771]: I0227 01:18:56.125612 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wn72" event={"ID":"54a1b4f0-c80a-4e94-ba7e-b5db00375495","Type":"ContainerStarted","Data":"67ee4c386d1f62e0fdfc70d97586a7705fd9d2848e11065627db56d772127bf8"}
Feb 27 01:18:57 crc kubenswrapper[4771]: I0227 01:18:57.132407 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wn72" event={"ID":"54a1b4f0-c80a-4e94-ba7e-b5db00375495","Type":"ContainerStarted","Data":"849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2"}
Feb 27 01:18:58 crc kubenswrapper[4771]: I0227 01:18:58.141576 4771 generic.go:334] "Generic (PLEG): container finished" podID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerID="849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2" exitCode=0
Feb 27 01:18:58 crc kubenswrapper[4771]: I0227 01:18:58.141623 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wn72" event={"ID":"54a1b4f0-c80a-4e94-ba7e-b5db00375495","Type":"ContainerDied","Data":"849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2"}
Feb 27 01:18:59 crc kubenswrapper[4771]: I0227 01:18:59.152355 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wn72" event={"ID":"54a1b4f0-c80a-4e94-ba7e-b5db00375495","Type":"ContainerStarted","Data":"4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4"}
Feb 27 01:18:59 crc kubenswrapper[4771]: I0227 01:18:59.171334 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2wn72" podStartSLOduration=2.754466832 podStartE2EDuration="5.171315298s" podCreationTimestamp="2026-02-27 01:18:54 +0000 UTC" firstStartedPulling="2026-02-27 01:18:56.129470618 +0000 UTC m=+849.067031946" lastFinishedPulling="2026-02-27 01:18:58.546319084 +0000 UTC m=+851.483880412" observedRunningTime="2026-02-27 01:18:59.17102127 +0000 UTC m=+852.108582568" watchObservedRunningTime="2026-02-27 01:18:59.171315298 +0000 UTC m=+852.108876596"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.079889 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-pfb4d"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.278423 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gwhb5"]
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.280195 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.310472 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gwhb5"]
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.402242 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckdm\" (UniqueName: \"kubernetes.io/projected/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-kube-api-access-mckdm\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.402322 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-utilities\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.402417 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-catalog-content\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.503900 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-utilities\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.504041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-catalog-content\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.504106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mckdm\" (UniqueName: \"kubernetes.io/projected/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-kube-api-access-mckdm\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.504916 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-utilities\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.505234 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-catalog-content\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.540283 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckdm\" (UniqueName: \"kubernetes.io/projected/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-kube-api-access-mckdm\") pod \"community-operators-gwhb5\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") " pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:01 crc kubenswrapper[4771]: I0227 01:19:01.660948 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:02 crc kubenswrapper[4771]: I0227 01:19:02.200733 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gwhb5"]
Feb 27 01:19:02 crc kubenswrapper[4771]: W0227 01:19:02.203601 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b31c0b5_81eb_4c5f_a79d_ddf6ac86d19f.slice/crio-ab59e6b561b6c60f7c88a91143074a1bf0a352aac4d343ceaabcba82edf9254b WatchSource:0}: Error finding container ab59e6b561b6c60f7c88a91143074a1bf0a352aac4d343ceaabcba82edf9254b: Status 404 returned error can't find the container with id ab59e6b561b6c60f7c88a91143074a1bf0a352aac4d343ceaabcba82edf9254b
Feb 27 01:19:03 crc kubenswrapper[4771]: I0227 01:19:03.198176 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerID="d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce" exitCode=0
Feb 27 01:19:03 crc kubenswrapper[4771]: I0227 01:19:03.198476 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwhb5" event={"ID":"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f","Type":"ContainerDied","Data":"d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce"}
Feb 27 01:19:03 crc kubenswrapper[4771]: I0227 01:19:03.199087 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwhb5" event={"ID":"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f","Type":"ContainerStarted","Data":"ab59e6b561b6c60f7c88a91143074a1bf0a352aac4d343ceaabcba82edf9254b"}
Feb 27 01:19:03 crc kubenswrapper[4771]: I0227 01:19:03.201735 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 01:19:05 crc kubenswrapper[4771]: I0227 01:19:05.219939 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerID="729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a" exitCode=0
Feb 27 01:19:05 crc kubenswrapper[4771]: I0227 01:19:05.220077 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwhb5" event={"ID":"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f","Type":"ContainerDied","Data":"729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a"}
Feb 27 01:19:05 crc kubenswrapper[4771]: I0227 01:19:05.233094 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:19:05 crc kubenswrapper[4771]: I0227 01:19:05.233140 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:19:05 crc kubenswrapper[4771]: I0227 01:19:05.294253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:19:06 crc kubenswrapper[4771]: I0227 01:19:06.232534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwhb5" event={"ID":"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f","Type":"ContainerStarted","Data":"f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6"}
Feb 27 01:19:06 crc kubenswrapper[4771]: I0227 01:19:06.277797 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gwhb5" podStartSLOduration=2.844062127 podStartE2EDuration="5.277768717s" podCreationTimestamp="2026-02-27 01:19:01 +0000 UTC" firstStartedPulling="2026-02-27 01:19:03.201317286 +0000 UTC m=+856.138878584" lastFinishedPulling="2026-02-27 01:19:05.635023886 +0000 UTC m=+858.572585174" observedRunningTime="2026-02-27 01:19:06.255628939 +0000 UTC m=+859.193190317" watchObservedRunningTime="2026-02-27 01:19:06.277768717 +0000 UTC m=+859.215330045"
Feb 27 01:19:06 crc kubenswrapper[4771]: I0227 01:19:06.303087 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:19:07 crc kubenswrapper[4771]: I0227 01:19:07.442367 4771 scope.go:117] "RemoveContainer" containerID="bb2a8e20c9d55117ec7502085f7f2fb218308d20f7d81a64bb218b6d74ba0a3e"
Feb 27 01:19:08 crc kubenswrapper[4771]: I0227 01:19:08.869298 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2wn72"]
Feb 27 01:19:08 crc kubenswrapper[4771]: I0227 01:19:08.869932 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2wn72" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerName="registry-server" containerID="cri-o://4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4" gracePeriod=2
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.236968 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.254047 4771 generic.go:334] "Generic (PLEG): container finished" podID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerID="4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4" exitCode=0
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.254116 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2wn72"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.254118 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wn72" event={"ID":"54a1b4f0-c80a-4e94-ba7e-b5db00375495","Type":"ContainerDied","Data":"4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4"}
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.254531 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2wn72" event={"ID":"54a1b4f0-c80a-4e94-ba7e-b5db00375495","Type":"ContainerDied","Data":"67ee4c386d1f62e0fdfc70d97586a7705fd9d2848e11065627db56d772127bf8"}
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.254600 4771 scope.go:117] "RemoveContainer" containerID="4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.291256 4771 scope.go:117] "RemoveContainer" containerID="849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.319726 4771 scope.go:117] "RemoveContainer" containerID="f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.327942 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-catalog-content\") pod \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") "
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.327973 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgv4c\" (UniqueName: \"kubernetes.io/projected/54a1b4f0-c80a-4e94-ba7e-b5db00375495-kube-api-access-lgv4c\") pod \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") "
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.328811 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-utilities" (OuterVolumeSpecName: "utilities") pod "54a1b4f0-c80a-4e94-ba7e-b5db00375495" (UID: "54a1b4f0-c80a-4e94-ba7e-b5db00375495"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.328903 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-utilities\") pod \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\" (UID: \"54a1b4f0-c80a-4e94-ba7e-b5db00375495\") "
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.330284 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.333149 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a1b4f0-c80a-4e94-ba7e-b5db00375495-kube-api-access-lgv4c" (OuterVolumeSpecName: "kube-api-access-lgv4c") pod "54a1b4f0-c80a-4e94-ba7e-b5db00375495" (UID: "54a1b4f0-c80a-4e94-ba7e-b5db00375495"). InnerVolumeSpecName "kube-api-access-lgv4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.333736 4771 scope.go:117] "RemoveContainer" containerID="4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4"
Feb 27 01:19:09 crc kubenswrapper[4771]: E0227 01:19:09.334324 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4\": container with ID starting with 4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4 not found: ID does not exist" containerID="4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.334410 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4"} err="failed to get container status \"4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4\": rpc error: code = NotFound desc = could not find container \"4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4\": container with ID starting with 4de1ea7dae70961924494dd59c8a1be0a50dcf899b623447023c638a3050abb4 not found: ID does not exist"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.334493 4771 scope.go:117] "RemoveContainer" containerID="849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2"
Feb 27 01:19:09 crc kubenswrapper[4771]: E0227 01:19:09.335008 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2\": container with ID starting with 849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2 not found: ID does not exist" containerID="849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.335135 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2"} err="failed to get container status \"849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2\": rpc error: code = NotFound desc = could not find container \"849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2\": container with ID starting with 849e0d2df3dc5711d47411afb2fa0c5e1ea2279fab18b7cd569755e106dd7bf2 not found: ID does not exist"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.335165 4771 scope.go:117] "RemoveContainer" containerID="f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef"
Feb 27 01:19:09 crc kubenswrapper[4771]: E0227 01:19:09.337771 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef\": container with ID starting with f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef not found: ID does not exist" containerID="f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.337809 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef"} err="failed to get container status \"f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef\": rpc error: code = NotFound desc = could not find container \"f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef\": container with ID starting with f71f85544847123cc3034fd716a005bf67474cdae78743d8ea7c69c091df0cef not found: ID does not exist"
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.388013 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54a1b4f0-c80a-4e94-ba7e-b5db00375495" (UID: "54a1b4f0-c80a-4e94-ba7e-b5db00375495"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.431929 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a1b4f0-c80a-4e94-ba7e-b5db00375495-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.431963 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgv4c\" (UniqueName: \"kubernetes.io/projected/54a1b4f0-c80a-4e94-ba7e-b5db00375495-kube-api-access-lgv4c\") on node \"crc\" DevicePath \"\""
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.586685 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2wn72"]
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.598258 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2wn72"]
Feb 27 01:19:09 crc kubenswrapper[4771]: I0227 01:19:09.785026 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" path="/var/lib/kubelet/pods/54a1b4f0-c80a-4e94-ba7e-b5db00375495/volumes"
Feb 27 01:19:11 crc kubenswrapper[4771]: I0227 01:19:11.661429 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:11 crc kubenswrapper[4771]: I0227 01:19:11.662281 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:11 crc kubenswrapper[4771]: I0227 01:19:11.740969 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:12 crc kubenswrapper[4771]: I0227 01:19:12.320583 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.270056 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gwhb5"]
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.271081 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gwhb5" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerName="registry-server" containerID="cri-o://f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6" gracePeriod=2
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.689639 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gwhb5"
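NOTE: The E-level entries in the teardown above ("ContainerStatus from runtime service failed" / "DeleteContainer returned error" with code = NotFound) are a benign race rather than a failure: by the time the kubelet's RemoveContainer housekeeping asks CRI-O for status, the containers are already gone. The W-level "Failed to process watch event ... Status 404" lines earlier are the mirror-image race at container creation, where cAdvisor sees the new cgroup before the container is registered. The NotFound strings are gRPC status codes from the CRI; a caller distinguishing "already removed" from a real failure might check them roughly as below (an illustrative sketch, not kubelet source):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // alreadyRemoved reports whether a CRI call failed only because the
    // container no longer exists, as in the NotFound errors logged above.
    func alreadyRemoved(err error) bool {
        return status.Code(err) == codes.NotFound
    }

    func main() {
        err := status.Error(codes.NotFound, "could not find container")
        fmt.Println(alreadyRemoved(err)) // true
    }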
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.813690 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mckdm\" (UniqueName: \"kubernetes.io/projected/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-kube-api-access-mckdm\") pod \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") "
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.813744 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-utilities\") pod \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") "
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.813763 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-catalog-content\") pod \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\" (UID: \"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f\") "
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.816050 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-utilities" (OuterVolumeSpecName: "utilities") pod "9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" (UID: "9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.822694 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-kube-api-access-mckdm" (OuterVolumeSpecName: "kube-api-access-mckdm") pod "9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" (UID: "9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f"). InnerVolumeSpecName "kube-api-access-mckdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.869294 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" (UID: "9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.917314 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mckdm\" (UniqueName: \"kubernetes.io/projected/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-kube-api-access-mckdm\") on node \"crc\" DevicePath \"\""
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.917366 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 01:19:15 crc kubenswrapper[4771]: I0227 01:19:15.917386 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.322041 4771 generic.go:334] "Generic (PLEG): container finished" podID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerID="f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6" exitCode=0
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.322150 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gwhb5"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.322178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwhb5" event={"ID":"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f","Type":"ContainerDied","Data":"f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6"}
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.322942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwhb5" event={"ID":"9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f","Type":"ContainerDied","Data":"ab59e6b561b6c60f7c88a91143074a1bf0a352aac4d343ceaabcba82edf9254b"}
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.322974 4771 scope.go:117] "RemoveContainer" containerID="f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359009 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7"]
Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.359433 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerName="extract-utilities"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359457 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerName="extract-utilities"
Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.359471 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerName="extract-content"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359480 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerName="extract-content"
Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.359514 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerName="extract-content"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359525 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerName="extract-content"
Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.359541 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerName="extract-utilities"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359583 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerName="extract-utilities"
Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.359609 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerName="registry-server"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359620 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerName="registry-server"
Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.359655 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerName="registry-server"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359665 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerName="registry-server"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359824 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a1b4f0-c80a-4e94-ba7e-b5db00375495" containerName="registry-server"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.359846 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" containerName="registry-server"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.360898 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.363145 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.369175 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7"]
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.382626 4771 scope.go:117] "RemoveContainer" containerID="729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.401134 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gwhb5"]
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.405686 4771 scope.go:117] "RemoveContainer" containerID="d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce"
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.406328 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gwhb5"]
Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.427431 4771 scope.go:117] "RemoveContainer" containerID="f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6"
Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.428042 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6\": container with ID starting with f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6 not found: ID does not exist" containerID="f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6"
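NOTE: The cpu_manager/memory_manager "RemoveStaleState" and "Deleted CPUSet assignment" burst above is routine garbage collection: once the two catalog pods are gone, the resource managers drop whatever per-container state they still track for extract-utilities, extract-content, and registry-server. The E-level severity on the cpu_manager lines appears to be cosmetic here; the paired state_mem and memory_manager entries show the cleanup completing normally, and the container names confirm the three-stage catalog-pod layout noted earlier.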
containerID="f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.428104 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6"} err="failed to get container status \"f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6\": rpc error: code = NotFound desc = could not find container \"f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6\": container with ID starting with f6d08f041dafbd050c4c56d1a454bc6b8c7215e1ef3b71c52909b3d98bba0fd6 not found: ID does not exist" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.428147 4771 scope.go:117] "RemoveContainer" containerID="729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a" Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.428791 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a\": container with ID starting with 729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a not found: ID does not exist" containerID="729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.428836 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a"} err="failed to get container status \"729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a\": rpc error: code = NotFound desc = could not find container \"729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a\": container with ID starting with 729dc05f6e1a8e3c0638a7cedff5f800d974c940658068c76ed3aecf02e8df3a not found: ID does not exist" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.428870 4771 scope.go:117] "RemoveContainer" containerID="d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce" Feb 27 01:19:16 crc kubenswrapper[4771]: E0227 01:19:16.429339 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce\": container with ID starting with d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce not found: ID does not exist" containerID="d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.429385 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce"} err="failed to get container status \"d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce\": rpc error: code = NotFound desc = could not find container \"d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce\": container with ID starting with d0f47bb08ee0e6bae76229f915f4937f1c1423a989b0b0da0e30108d01e717ce not found: ID does not exist" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.442639 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.442734 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw28t\" (UniqueName: \"kubernetes.io/projected/008a5eed-f47a-4fd7-8fbe-c442e115da9a-kube-api-access-bw28t\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.442808 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.543950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.543997 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw28t\" (UniqueName: \"kubernetes.io/projected/008a5eed-f47a-4fd7-8fbe-c442e115da9a-kube-api-access-bw28t\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.544032 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.544634 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.544641 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.564063 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw28t\" (UniqueName: 
\"kubernetes.io/projected/008a5eed-f47a-4fd7-8fbe-c442e115da9a-kube-api-access-bw28t\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.697700 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:16 crc kubenswrapper[4771]: I0227 01:19:16.986749 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-65dsm" podUID="db8009a0-8b08-421c-8f35-e3127b0b5e8e" containerName="console" containerID="cri-o://1a941b4ceb914b38949c9af872a4e5af7293c8636bd1a1d01ffd2ad49def31fe" gracePeriod=15 Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.107944 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7"] Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.342037 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" event={"ID":"008a5eed-f47a-4fd7-8fbe-c442e115da9a","Type":"ContainerStarted","Data":"ffd8079da368cb35a786f4c794a8a21b52415da72a3dc6720e2fbb811545a4f4"} Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.342081 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" event={"ID":"008a5eed-f47a-4fd7-8fbe-c442e115da9a","Type":"ContainerStarted","Data":"bed4cee0eb843ede00feb80e14adc16bd8bc64f0e373ccfa99cc298e44397292"} Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.346975 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-65dsm_db8009a0-8b08-421c-8f35-e3127b0b5e8e/console/0.log" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.347017 4771 generic.go:334] "Generic (PLEG): container finished" podID="db8009a0-8b08-421c-8f35-e3127b0b5e8e" containerID="1a941b4ceb914b38949c9af872a4e5af7293c8636bd1a1d01ffd2ad49def31fe" exitCode=2 Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.347042 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65dsm" event={"ID":"db8009a0-8b08-421c-8f35-e3127b0b5e8e","Type":"ContainerDied","Data":"1a941b4ceb914b38949c9af872a4e5af7293c8636bd1a1d01ffd2ad49def31fe"} Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.416788 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-65dsm_db8009a0-8b08-421c-8f35-e3127b0b5e8e/console/0.log" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.416856 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.555755 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-serving-cert\") pod \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.555878 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-oauth-config\") pod \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.555940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-oauth-serving-cert\") pod \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.556012 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-trusted-ca-bundle\") pod \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.556064 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-service-ca\") pod \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.556125 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-config\") pod \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.556183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qck5x\" (UniqueName: \"kubernetes.io/projected/db8009a0-8b08-421c-8f35-e3127b0b5e8e-kube-api-access-qck5x\") pod \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\" (UID: \"db8009a0-8b08-421c-8f35-e3127b0b5e8e\") " Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.557468 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-config" (OuterVolumeSpecName: "console-config") pod "db8009a0-8b08-421c-8f35-e3127b0b5e8e" (UID: "db8009a0-8b08-421c-8f35-e3127b0b5e8e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.557668 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-service-ca" (OuterVolumeSpecName: "service-ca") pod "db8009a0-8b08-421c-8f35-e3127b0b5e8e" (UID: "db8009a0-8b08-421c-8f35-e3127b0b5e8e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.557799 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "db8009a0-8b08-421c-8f35-e3127b0b5e8e" (UID: "db8009a0-8b08-421c-8f35-e3127b0b5e8e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.558877 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "db8009a0-8b08-421c-8f35-e3127b0b5e8e" (UID: "db8009a0-8b08-421c-8f35-e3127b0b5e8e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.567988 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8009a0-8b08-421c-8f35-e3127b0b5e8e-kube-api-access-qck5x" (OuterVolumeSpecName: "kube-api-access-qck5x") pod "db8009a0-8b08-421c-8f35-e3127b0b5e8e" (UID: "db8009a0-8b08-421c-8f35-e3127b0b5e8e"). InnerVolumeSpecName "kube-api-access-qck5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.568398 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db8009a0-8b08-421c-8f35-e3127b0b5e8e" (UID: "db8009a0-8b08-421c-8f35-e3127b0b5e8e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.572141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db8009a0-8b08-421c-8f35-e3127b0b5e8e" (UID: "db8009a0-8b08-421c-8f35-e3127b0b5e8e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.657750 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qck5x\" (UniqueName: \"kubernetes.io/projected/db8009a0-8b08-421c-8f35-e3127b0b5e8e-kube-api-access-qck5x\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.657812 4771 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.657832 4771 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.657850 4771 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.657869 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.657887 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.657905 4771 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db8009a0-8b08-421c-8f35-e3127b0b5e8e-console-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:17 crc kubenswrapper[4771]: I0227 01:19:17.780026 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f" path="/var/lib/kubelet/pods/9b31c0b5-81eb-4c5f-a79d-ddf6ac86d19f/volumes" Feb 27 01:19:17 crc kubenswrapper[4771]: E0227 01:19:17.909804 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8009a0_8b08_421c_8f35_e3127b0b5e8e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8009a0_8b08_421c_8f35_e3127b0b5e8e.slice/crio-dec5b9c457df15717739641c79145cc35de5bfc378494663a11ec140f8b6980e\": RecentStats: unable to find data in memory cache]" Feb 27 01:19:18 crc kubenswrapper[4771]: I0227 01:19:18.354652 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-65dsm_db8009a0-8b08-421c-8f35-e3127b0b5e8e/console/0.log" Feb 27 01:19:18 crc kubenswrapper[4771]: I0227 01:19:18.354858 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-65dsm" event={"ID":"db8009a0-8b08-421c-8f35-e3127b0b5e8e","Type":"ContainerDied","Data":"dec5b9c457df15717739641c79145cc35de5bfc378494663a11ec140f8b6980e"} Feb 27 01:19:18 crc kubenswrapper[4771]: I0227 01:19:18.354904 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-65dsm" Feb 27 01:19:18 crc kubenswrapper[4771]: I0227 01:19:18.354925 4771 scope.go:117] "RemoveContainer" containerID="1a941b4ceb914b38949c9af872a4e5af7293c8636bd1a1d01ffd2ad49def31fe" Feb 27 01:19:18 crc kubenswrapper[4771]: I0227 01:19:18.359902 4771 generic.go:334] "Generic (PLEG): container finished" podID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerID="ffd8079da368cb35a786f4c794a8a21b52415da72a3dc6720e2fbb811545a4f4" exitCode=0 Feb 27 01:19:18 crc kubenswrapper[4771]: I0227 01:19:18.359972 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" event={"ID":"008a5eed-f47a-4fd7-8fbe-c442e115da9a","Type":"ContainerDied","Data":"ffd8079da368cb35a786f4c794a8a21b52415da72a3dc6720e2fbb811545a4f4"} Feb 27 01:19:18 crc kubenswrapper[4771]: I0227 01:19:18.410080 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-65dsm"] Feb 27 01:19:18 crc kubenswrapper[4771]: I0227 01:19:18.417732 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-65dsm"] Feb 27 01:19:19 crc kubenswrapper[4771]: I0227 01:19:19.785470 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8009a0-8b08-421c-8f35-e3127b0b5e8e" path="/var/lib/kubelet/pods/db8009a0-8b08-421c-8f35-e3127b0b5e8e/volumes" Feb 27 01:19:20 crc kubenswrapper[4771]: I0227 01:19:20.381796 4771 generic.go:334] "Generic (PLEG): container finished" podID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerID="eccef6380699f403e3272aedef8046baae2069bc01f9ce8e02309c45b124ace5" exitCode=0 Feb 27 01:19:20 crc kubenswrapper[4771]: I0227 01:19:20.381859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" event={"ID":"008a5eed-f47a-4fd7-8fbe-c442e115da9a","Type":"ContainerDied","Data":"eccef6380699f403e3272aedef8046baae2069bc01f9ce8e02309c45b124ace5"} Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.074892 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7m6lh"] Feb 27 01:19:21 crc kubenswrapper[4771]: E0227 01:19:21.075464 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8009a0-8b08-421c-8f35-e3127b0b5e8e" containerName="console" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.075479 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8009a0-8b08-421c-8f35-e3127b0b5e8e" containerName="console" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.075641 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8009a0-8b08-421c-8f35-e3127b0b5e8e" containerName="console" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.076621 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.104092 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7m6lh"] Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.217202 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-utilities\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.217328 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-catalog-content\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.217368 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xv6q\" (UniqueName: \"kubernetes.io/projected/2f18297b-36f4-49f3-b325-69e7d9e36768-kube-api-access-5xv6q\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.319159 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-utilities\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.319287 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-catalog-content\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.319332 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xv6q\" (UniqueName: \"kubernetes.io/projected/2f18297b-36f4-49f3-b325-69e7d9e36768-kube-api-access-5xv6q\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.320160 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-utilities\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.320183 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-catalog-content\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.364363 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5xv6q\" (UniqueName: \"kubernetes.io/projected/2f18297b-36f4-49f3-b325-69e7d9e36768-kube-api-access-5xv6q\") pod \"redhat-operators-7m6lh\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.395640 4771 generic.go:334] "Generic (PLEG): container finished" podID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerID="15f31def9cac9ae3f0ae2bffef97e5bae01578bd9c7a35281c4a4d8e464bbffb" exitCode=0 Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.395687 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" event={"ID":"008a5eed-f47a-4fd7-8fbe-c442e115da9a","Type":"ContainerDied","Data":"15f31def9cac9ae3f0ae2bffef97e5bae01578bd9c7a35281c4a4d8e464bbffb"} Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.431498 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:21 crc kubenswrapper[4771]: I0227 01:19:21.846811 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7m6lh"] Feb 27 01:19:21 crc kubenswrapper[4771]: W0227 01:19:21.851725 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f18297b_36f4_49f3_b325_69e7d9e36768.slice/crio-2453969688b819573a20ff0e57032feca2416d734b266545d80efbb3152ea41b WatchSource:0}: Error finding container 2453969688b819573a20ff0e57032feca2416d734b266545d80efbb3152ea41b: Status 404 returned error can't find the container with id 2453969688b819573a20ff0e57032feca2416d734b266545d80efbb3152ea41b Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.401995 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerID="5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9" exitCode=0 Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.402095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7m6lh" event={"ID":"2f18297b-36f4-49f3-b325-69e7d9e36768","Type":"ContainerDied","Data":"5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9"} Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.402142 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7m6lh" event={"ID":"2f18297b-36f4-49f3-b325-69e7d9e36768","Type":"ContainerStarted","Data":"2453969688b819573a20ff0e57032feca2416d734b266545d80efbb3152ea41b"} Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.628712 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.737464 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw28t\" (UniqueName: \"kubernetes.io/projected/008a5eed-f47a-4fd7-8fbe-c442e115da9a-kube-api-access-bw28t\") pod \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.737542 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-util\") pod \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.737634 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-bundle\") pod \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\" (UID: \"008a5eed-f47a-4fd7-8fbe-c442e115da9a\") " Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.738523 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-bundle" (OuterVolumeSpecName: "bundle") pod "008a5eed-f47a-4fd7-8fbe-c442e115da9a" (UID: "008a5eed-f47a-4fd7-8fbe-c442e115da9a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.738741 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.742329 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008a5eed-f47a-4fd7-8fbe-c442e115da9a-kube-api-access-bw28t" (OuterVolumeSpecName: "kube-api-access-bw28t") pod "008a5eed-f47a-4fd7-8fbe-c442e115da9a" (UID: "008a5eed-f47a-4fd7-8fbe-c442e115da9a"). InnerVolumeSpecName "kube-api-access-bw28t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.764127 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-util" (OuterVolumeSpecName: "util") pod "008a5eed-f47a-4fd7-8fbe-c442e115da9a" (UID: "008a5eed-f47a-4fd7-8fbe-c442e115da9a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.840341 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw28t\" (UniqueName: \"kubernetes.io/projected/008a5eed-f47a-4fd7-8fbe-c442e115da9a-kube-api-access-bw28t\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:22 crc kubenswrapper[4771]: I0227 01:19:22.840974 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/008a5eed-f47a-4fd7-8fbe-c442e115da9a-util\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:23 crc kubenswrapper[4771]: I0227 01:19:23.412401 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7m6lh" event={"ID":"2f18297b-36f4-49f3-b325-69e7d9e36768","Type":"ContainerStarted","Data":"be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59"} Feb 27 01:19:23 crc kubenswrapper[4771]: I0227 01:19:23.414988 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" event={"ID":"008a5eed-f47a-4fd7-8fbe-c442e115da9a","Type":"ContainerDied","Data":"bed4cee0eb843ede00feb80e14adc16bd8bc64f0e373ccfa99cc298e44397292"} Feb 27 01:19:23 crc kubenswrapper[4771]: I0227 01:19:23.415030 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bed4cee0eb843ede00feb80e14adc16bd8bc64f0e373ccfa99cc298e44397292" Feb 27 01:19:23 crc kubenswrapper[4771]: I0227 01:19:23.415049 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7" Feb 27 01:19:24 crc kubenswrapper[4771]: I0227 01:19:24.426530 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerID="be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59" exitCode=0 Feb 27 01:19:24 crc kubenswrapper[4771]: I0227 01:19:24.426628 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7m6lh" event={"ID":"2f18297b-36f4-49f3-b325-69e7d9e36768","Type":"ContainerDied","Data":"be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59"} Feb 27 01:19:25 crc kubenswrapper[4771]: I0227 01:19:25.438144 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7m6lh" event={"ID":"2f18297b-36f4-49f3-b325-69e7d9e36768","Type":"ContainerStarted","Data":"af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd"} Feb 27 01:19:25 crc kubenswrapper[4771]: I0227 01:19:25.479616 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7m6lh" podStartSLOduration=2.070875792 podStartE2EDuration="4.479580995s" podCreationTimestamp="2026-02-27 01:19:21 +0000 UTC" firstStartedPulling="2026-02-27 01:19:22.404746989 +0000 UTC m=+875.342308277" lastFinishedPulling="2026-02-27 01:19:24.813452172 +0000 UTC m=+877.751013480" observedRunningTime="2026-02-27 01:19:25.469629662 +0000 UTC m=+878.407190990" watchObservedRunningTime="2026-02-27 01:19:25.479580995 +0000 UTC m=+878.417142373" Feb 27 01:19:28 crc kubenswrapper[4771]: I0227 01:19:28.953090 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 27 01:19:28 crc kubenswrapper[4771]: I0227 01:19:28.953530 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:19:31 crc kubenswrapper[4771]: I0227 01:19:31.432623 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:31 crc kubenswrapper[4771]: I0227 01:19:31.433024 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:32 crc kubenswrapper[4771]: I0227 01:19:32.493888 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7m6lh" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="registry-server" probeResult="failure" output=< Feb 27 01:19:32 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 27 01:19:32 crc kubenswrapper[4771]: > Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.454059 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7"] Feb 27 01:19:35 crc kubenswrapper[4771]: E0227 01:19:35.454775 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerName="pull" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.454799 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerName="pull" Feb 27 01:19:35 crc kubenswrapper[4771]: E0227 01:19:35.454822 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerName="util" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.454835 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerName="util" Feb 27 01:19:35 crc kubenswrapper[4771]: E0227 01:19:35.454853 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerName="extract" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.454865 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerName="extract" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.455033 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="008a5eed-f47a-4fd7-8fbe-c442e115da9a" containerName="extract" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.455648 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.458753 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.466741 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-q7zgm" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.466803 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.466935 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.468514 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.472159 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7"] Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.606085 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3294b45f-a2de-4a92-8466-46c17ddd0238-apiservice-cert\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.606154 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3294b45f-a2de-4a92-8466-46c17ddd0238-webhook-cert\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.606202 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck6bk\" (UniqueName: \"kubernetes.io/projected/3294b45f-a2de-4a92-8466-46c17ddd0238-kube-api-access-ck6bk\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.708223 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3294b45f-a2de-4a92-8466-46c17ddd0238-webhook-cert\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.708319 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck6bk\" (UniqueName: \"kubernetes.io/projected/3294b45f-a2de-4a92-8466-46c17ddd0238-kube-api-access-ck6bk\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.708373 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3294b45f-a2de-4a92-8466-46c17ddd0238-apiservice-cert\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.715155 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3294b45f-a2de-4a92-8466-46c17ddd0238-apiservice-cert\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.715608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3294b45f-a2de-4a92-8466-46c17ddd0238-webhook-cert\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.732013 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck6bk\" (UniqueName: \"kubernetes.io/projected/3294b45f-a2de-4a92-8466-46c17ddd0238-kube-api-access-ck6bk\") pod \"metallb-operator-controller-manager-d75cc4945-g8fp7\" (UID: \"3294b45f-a2de-4a92-8466-46c17ddd0238\") " pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.772046 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.789882 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2"] Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.790608 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.798408 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.822659 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-99c4d" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.825659 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.828819 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2"] Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.838966 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x996p\" (UniqueName: \"kubernetes.io/projected/635de0be-09c0-49ad-905c-49caa1c8b50e-kube-api-access-x996p\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.839162 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/635de0be-09c0-49ad-905c-49caa1c8b50e-webhook-cert\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.839276 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/635de0be-09c0-49ad-905c-49caa1c8b50e-apiservice-cert\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.940483 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/635de0be-09c0-49ad-905c-49caa1c8b50e-webhook-cert\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.940818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/635de0be-09c0-49ad-905c-49caa1c8b50e-apiservice-cert\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.940852 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x996p\" (UniqueName: \"kubernetes.io/projected/635de0be-09c0-49ad-905c-49caa1c8b50e-kube-api-access-x996p\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 
01:19:35.945808 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/635de0be-09c0-49ad-905c-49caa1c8b50e-webhook-cert\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.945946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/635de0be-09c0-49ad-905c-49caa1c8b50e-apiservice-cert\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:35 crc kubenswrapper[4771]: I0227 01:19:35.960447 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x996p\" (UniqueName: \"kubernetes.io/projected/635de0be-09c0-49ad-905c-49caa1c8b50e-kube-api-access-x996p\") pod \"metallb-operator-webhook-server-65c77fdb5d-6ltq2\" (UID: \"635de0be-09c0-49ad-905c-49caa1c8b50e\") " pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:36 crc kubenswrapper[4771]: I0227 01:19:36.044356 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7"] Feb 27 01:19:36 crc kubenswrapper[4771]: W0227 01:19:36.053126 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3294b45f_a2de_4a92_8466_46c17ddd0238.slice/crio-642f1f430929c9e51b0977527be4c61974109f5957c41d64805e8aec26fae34a WatchSource:0}: Error finding container 642f1f430929c9e51b0977527be4c61974109f5957c41d64805e8aec26fae34a: Status 404 returned error can't find the container with id 642f1f430929c9e51b0977527be4c61974109f5957c41d64805e8aec26fae34a Feb 27 01:19:36 crc kubenswrapper[4771]: I0227 01:19:36.160385 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:36 crc kubenswrapper[4771]: I0227 01:19:36.505139 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" event={"ID":"3294b45f-a2de-4a92-8466-46c17ddd0238","Type":"ContainerStarted","Data":"642f1f430929c9e51b0977527be4c61974109f5957c41d64805e8aec26fae34a"} Feb 27 01:19:36 crc kubenswrapper[4771]: I0227 01:19:36.605622 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2"] Feb 27 01:19:37 crc kubenswrapper[4771]: I0227 01:19:37.515774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" event={"ID":"635de0be-09c0-49ad-905c-49caa1c8b50e","Type":"ContainerStarted","Data":"0639963f111443aa4f886f99de499666a1490cb50e7c9d90071ec066918ef20b"} Feb 27 01:19:39 crc kubenswrapper[4771]: I0227 01:19:39.536764 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" event={"ID":"3294b45f-a2de-4a92-8466-46c17ddd0238","Type":"ContainerStarted","Data":"59264bab923c80adf4db5aa8ff364cc2317dc1c726782d151fe4ea3c39663245"} Feb 27 01:19:39 crc kubenswrapper[4771]: I0227 01:19:39.537735 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:19:39 crc kubenswrapper[4771]: I0227 01:19:39.560892 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" podStartSLOduration=1.271780607 podStartE2EDuration="4.56086789s" podCreationTimestamp="2026-02-27 01:19:35 +0000 UTC" firstStartedPulling="2026-02-27 01:19:36.056600154 +0000 UTC m=+888.994161462" lastFinishedPulling="2026-02-27 01:19:39.345687457 +0000 UTC m=+892.283248745" observedRunningTime="2026-02-27 01:19:39.556541511 +0000 UTC m=+892.494102799" watchObservedRunningTime="2026-02-27 01:19:39.56086789 +0000 UTC m=+892.498429188" Feb 27 01:19:41 crc kubenswrapper[4771]: I0227 01:19:41.491328 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:41 crc kubenswrapper[4771]: I0227 01:19:41.535489 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:42 crc kubenswrapper[4771]: I0227 01:19:42.560429 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" event={"ID":"635de0be-09c0-49ad-905c-49caa1c8b50e","Type":"ContainerStarted","Data":"bc28f05ab06dacece9b2ca7651a8fe8fff836f8e425c27b334e92ca7a7560fb5"} Feb 27 01:19:42 crc kubenswrapper[4771]: I0227 01:19:42.560821 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:42 crc kubenswrapper[4771]: I0227 01:19:42.579363 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" podStartSLOduration=2.39363277 podStartE2EDuration="7.57934682s" podCreationTimestamp="2026-02-27 01:19:35 +0000 UTC" firstStartedPulling="2026-02-27 01:19:36.612423391 +0000 UTC m=+889.549984679" lastFinishedPulling="2026-02-27 01:19:41.798137431 +0000 UTC 
m=+894.735698729" observedRunningTime="2026-02-27 01:19:42.576127832 +0000 UTC m=+895.513689120" watchObservedRunningTime="2026-02-27 01:19:42.57934682 +0000 UTC m=+895.516908098" Feb 27 01:19:44 crc kubenswrapper[4771]: I0227 01:19:44.670048 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7m6lh"] Feb 27 01:19:44 crc kubenswrapper[4771]: I0227 01:19:44.670462 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7m6lh" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="registry-server" containerID="cri-o://af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd" gracePeriod=2 Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.043745 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.062909 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-utilities\") pod \"2f18297b-36f4-49f3-b325-69e7d9e36768\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.063001 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xv6q\" (UniqueName: \"kubernetes.io/projected/2f18297b-36f4-49f3-b325-69e7d9e36768-kube-api-access-5xv6q\") pod \"2f18297b-36f4-49f3-b325-69e7d9e36768\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.064286 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-utilities" (OuterVolumeSpecName: "utilities") pod "2f18297b-36f4-49f3-b325-69e7d9e36768" (UID: "2f18297b-36f4-49f3-b325-69e7d9e36768"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.074747 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f18297b-36f4-49f3-b325-69e7d9e36768-kube-api-access-5xv6q" (OuterVolumeSpecName: "kube-api-access-5xv6q") pod "2f18297b-36f4-49f3-b325-69e7d9e36768" (UID: "2f18297b-36f4-49f3-b325-69e7d9e36768"). InnerVolumeSpecName "kube-api-access-5xv6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.164016 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-catalog-content\") pod \"2f18297b-36f4-49f3-b325-69e7d9e36768\" (UID: \"2f18297b-36f4-49f3-b325-69e7d9e36768\") " Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.164428 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.164463 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xv6q\" (UniqueName: \"kubernetes.io/projected/2f18297b-36f4-49f3-b325-69e7d9e36768-kube-api-access-5xv6q\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.340570 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f18297b-36f4-49f3-b325-69e7d9e36768" (UID: "2f18297b-36f4-49f3-b325-69e7d9e36768"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.366705 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f18297b-36f4-49f3-b325-69e7d9e36768-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.584434 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerID="af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd" exitCode=0 Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.584476 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7m6lh" event={"ID":"2f18297b-36f4-49f3-b325-69e7d9e36768","Type":"ContainerDied","Data":"af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd"} Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.584488 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7m6lh" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.584535 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7m6lh" event={"ID":"2f18297b-36f4-49f3-b325-69e7d9e36768","Type":"ContainerDied","Data":"2453969688b819573a20ff0e57032feca2416d734b266545d80efbb3152ea41b"} Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.584599 4771 scope.go:117] "RemoveContainer" containerID="af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.597770 4771 scope.go:117] "RemoveContainer" containerID="be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.609253 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7m6lh"] Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.617253 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7m6lh"] Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.624014 4771 scope.go:117] "RemoveContainer" containerID="5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.641420 4771 scope.go:117] "RemoveContainer" containerID="af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd" Feb 27 01:19:45 crc kubenswrapper[4771]: E0227 01:19:45.641844 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd\": container with ID starting with af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd not found: ID does not exist" containerID="af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.641877 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd"} err="failed to get container status \"af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd\": rpc error: code = NotFound desc = could not find container \"af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd\": container with ID starting with af0420455a94a2ddb5b9df1365699a5db61273027d64ed816bc5731b2acd47bd not found: ID does not exist" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.641897 4771 scope.go:117] "RemoveContainer" containerID="be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59" Feb 27 01:19:45 crc kubenswrapper[4771]: E0227 01:19:45.642196 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59\": container with ID starting with be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59 not found: ID does not exist" containerID="be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.642217 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59"} err="failed to get container status \"be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59\": rpc error: code = NotFound desc = could not find container 
\"be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59\": container with ID starting with be3b041946d8ef7ef700484fb7867626eb7edbff80b9c259e2e6090bfed64b59 not found: ID does not exist" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.642229 4771 scope.go:117] "RemoveContainer" containerID="5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9" Feb 27 01:19:45 crc kubenswrapper[4771]: E0227 01:19:45.642590 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9\": container with ID starting with 5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9 not found: ID does not exist" containerID="5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.642639 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9"} err="failed to get container status \"5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9\": rpc error: code = NotFound desc = could not find container \"5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9\": container with ID starting with 5249214e852122e89505c38630a7ab5f8a1fb9514bf0390482f2455a251e31e9 not found: ID does not exist" Feb 27 01:19:45 crc kubenswrapper[4771]: I0227 01:19:45.782261 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" path="/var/lib/kubelet/pods/2f18297b-36f4-49f3-b325-69e7d9e36768/volumes" Feb 27 01:19:56 crc kubenswrapper[4771]: I0227 01:19:56.165761 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-65c77fdb5d-6ltq2" Feb 27 01:19:58 crc kubenswrapper[4771]: I0227 01:19:58.952896 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:19:58 crc kubenswrapper[4771]: I0227 01:19:58.953313 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.151168 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535920-6lvqm"] Feb 27 01:20:00 crc kubenswrapper[4771]: E0227 01:20:00.151597 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="registry-server" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.151609 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="registry-server" Feb 27 01:20:00 crc kubenswrapper[4771]: E0227 01:20:00.151621 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="extract-utilities" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.151627 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="extract-utilities" Feb 27 01:20:00 crc kubenswrapper[4771]: E0227 01:20:00.151645 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="extract-content" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.151650 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="extract-content" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.151759 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f18297b-36f4-49f3-b325-69e7d9e36768" containerName="registry-server" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.152086 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-6lvqm" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.154906 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.155101 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.155219 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.168883 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-6lvqm"] Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.256381 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mws\" (UniqueName: \"kubernetes.io/projected/db5208a9-6556-4267-8519-a646c7b1aff6-kube-api-access-t2mws\") pod \"auto-csr-approver-29535920-6lvqm\" (UID: \"db5208a9-6556-4267-8519-a646c7b1aff6\") " pod="openshift-infra/auto-csr-approver-29535920-6lvqm" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.358066 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mws\" (UniqueName: \"kubernetes.io/projected/db5208a9-6556-4267-8519-a646c7b1aff6-kube-api-access-t2mws\") pod \"auto-csr-approver-29535920-6lvqm\" (UID: \"db5208a9-6556-4267-8519-a646c7b1aff6\") " pod="openshift-infra/auto-csr-approver-29535920-6lvqm" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.390015 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mws\" (UniqueName: \"kubernetes.io/projected/db5208a9-6556-4267-8519-a646c7b1aff6-kube-api-access-t2mws\") pod \"auto-csr-approver-29535920-6lvqm\" (UID: \"db5208a9-6556-4267-8519-a646c7b1aff6\") " pod="openshift-infra/auto-csr-approver-29535920-6lvqm" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.465330 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-6lvqm" Feb 27 01:20:00 crc kubenswrapper[4771]: I0227 01:20:00.774106 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-6lvqm"] Feb 27 01:20:01 crc kubenswrapper[4771]: I0227 01:20:01.693759 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-6lvqm" event={"ID":"db5208a9-6556-4267-8519-a646c7b1aff6","Type":"ContainerStarted","Data":"4308d5aad93b1e67239238666d6546897a0bd406d3235541032e6139529a851a"} Feb 27 01:20:02 crc kubenswrapper[4771]: I0227 01:20:02.700652 4771 generic.go:334] "Generic (PLEG): container finished" podID="db5208a9-6556-4267-8519-a646c7b1aff6" containerID="044206f14224ccf813eacb74b23ed15fc083de8149735fb6e64399341fa97794" exitCode=0 Feb 27 01:20:02 crc kubenswrapper[4771]: I0227 01:20:02.700743 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-6lvqm" event={"ID":"db5208a9-6556-4267-8519-a646c7b1aff6","Type":"ContainerDied","Data":"044206f14224ccf813eacb74b23ed15fc083de8149735fb6e64399341fa97794"} Feb 27 01:20:03 crc kubenswrapper[4771]: I0227 01:20:03.983150 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-6lvqm" Feb 27 01:20:04 crc kubenswrapper[4771]: I0227 01:20:04.114670 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2mws\" (UniqueName: \"kubernetes.io/projected/db5208a9-6556-4267-8519-a646c7b1aff6-kube-api-access-t2mws\") pod \"db5208a9-6556-4267-8519-a646c7b1aff6\" (UID: \"db5208a9-6556-4267-8519-a646c7b1aff6\") " Feb 27 01:20:04 crc kubenswrapper[4771]: I0227 01:20:04.125765 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5208a9-6556-4267-8519-a646c7b1aff6-kube-api-access-t2mws" (OuterVolumeSpecName: "kube-api-access-t2mws") pod "db5208a9-6556-4267-8519-a646c7b1aff6" (UID: "db5208a9-6556-4267-8519-a646c7b1aff6"). InnerVolumeSpecName "kube-api-access-t2mws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:20:04 crc kubenswrapper[4771]: I0227 01:20:04.216531 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2mws\" (UniqueName: \"kubernetes.io/projected/db5208a9-6556-4267-8519-a646c7b1aff6-kube-api-access-t2mws\") on node \"crc\" DevicePath \"\"" Feb 27 01:20:04 crc kubenswrapper[4771]: I0227 01:20:04.717826 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535920-6lvqm" event={"ID":"db5208a9-6556-4267-8519-a646c7b1aff6","Type":"ContainerDied","Data":"4308d5aad93b1e67239238666d6546897a0bd406d3235541032e6139529a851a"} Feb 27 01:20:04 crc kubenswrapper[4771]: I0227 01:20:04.717884 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4308d5aad93b1e67239238666d6546897a0bd406d3235541032e6139529a851a" Feb 27 01:20:04 crc kubenswrapper[4771]: I0227 01:20:04.717891 4771 util.go:48] "No ready sandbox for pod can be found. 
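Need to start a new one" pod="openshift-infra/auto-csr-approver-29535920-6lvqm"

The auto-csr-approver entries trace a single CronJob run end to end: pod ADD at 01:20:00, the oc container finishing with exit code 0, volume teardown, and, just below, the previous run's pod (-29535914-) deleted, presumably by the job history cleanup. The numeric suffix in these names is the scheduled run time in minutes since the Unix epoch, which, to the best of my knowledge, is the CronJob controller's Job-naming convention; it is easy to check:

package main

import (
    "fmt"
    "time"
)

func main() {
    // Jobs created by a CronJob carry the scheduled slot expressed in
    // minutes since the Unix epoch: 29535920 min * 60 = 1772155200 s.
    for _, suffix := range []int64{29535920, 29535914} {
        fmt.Println(suffix, "=>", time.Unix(suffix*60, 0).UTC())
    }
    // Output:
    // 29535920 => 2026-02-27 01:20:00 +0000 UTC  (this run, matching the ADD time)
    // 29535914 => 2026-02-27 01:14:00 +0000 UTC  (the run being cleaned up)
}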
Feb 27 01:20:05 crc kubenswrapper[4771]: I0227 01:20:05.048802 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-cm6cs"] Feb 27 01:20:05 crc kubenswrapper[4771]: I0227 01:20:05.056290 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535914-cm6cs"] Feb 27 01:20:05 crc kubenswrapper[4771]: I0227 01:20:05.782630 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc341037-7ad9-499e-b1cb-e3523551dcf5" path="/var/lib/kubelet/pods/dc341037-7ad9-499e-b1cb-e3523551dcf5/volumes" Feb 27 01:20:07 crc kubenswrapper[4771]: I0227 01:20:07.502919 4771 scope.go:117] "RemoveContainer" containerID="6008c2469e289a3ca37c87e8197b0bb98a04f606d12591197345ccfdb0bb85f8" Feb 27 01:20:15 crc kubenswrapper[4771]: I0227 01:20:15.784997 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-d75cc4945-g8fp7" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.634721 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cvvlf"] Feb 27 01:20:16 crc kubenswrapper[4771]: E0227 01:20:16.635443 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5208a9-6556-4267-8519-a646c7b1aff6" containerName="oc" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.635473 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5208a9-6556-4267-8519-a646c7b1aff6" containerName="oc" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.635695 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="db5208a9-6556-4267-8519-a646c7b1aff6" containerName="oc" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.639096 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.641870 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q"] Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.642698 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.642796 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.642886 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-h7mc5" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.647032 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.647160 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.666924 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q"] Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699021 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-conf\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76bgp\" (UniqueName: \"kubernetes.io/projected/88db3f72-8aff-4838-b58f-a37d0e2e2e64-kube-api-access-76bgp\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699136 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88db3f72-8aff-4838-b58f-a37d0e2e2e64-metrics-certs\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699168 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-sockets\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699187 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-reloader\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699213 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3da97e-a051-4d50-b905-3ed4c804cfc6-cert\") pod \"frr-k8s-webhook-server-7f989f654f-8mq2q\" (UID: \"4e3da97e-a051-4d50-b905-3ed4c804cfc6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699248 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-metrics\") pod \"frr-k8s-cvvlf\" (UID: 
\"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699275 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-startup\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.699306 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtc6g\" (UniqueName: \"kubernetes.io/projected/4e3da97e-a051-4d50-b905-3ed4c804cfc6-kube-api-access-mtc6g\") pod \"frr-k8s-webhook-server-7f989f654f-8mq2q\" (UID: \"4e3da97e-a051-4d50-b905-3ed4c804cfc6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.714807 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-l5k7x"] Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.715642 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.720310 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.720540 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jfbg5" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.720721 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.720867 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.743444 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-sgfp9"] Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.748652 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.753199 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.780881 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-sgfp9"] Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800345 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-sockets\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800399 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-reloader\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw5kz\" (UniqueName: \"kubernetes.io/projected/61a9b00d-d330-4575-bdac-adff64f6786d-kube-api-access-jw5kz\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3da97e-a051-4d50-b905-3ed4c804cfc6-cert\") pod \"frr-k8s-webhook-server-7f989f654f-8mq2q\" (UID: \"4e3da97e-a051-4d50-b905-3ed4c804cfc6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-metrics-certs\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd363b49-3f3c-46af-834d-5ab27e2ed35e-metrics-certs\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800583 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-metrics\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800604 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-startup\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800626 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mtc6g\" (UniqueName: \"kubernetes.io/projected/4e3da97e-a051-4d50-b905-3ed4c804cfc6-kube-api-access-mtc6g\") pod \"frr-k8s-webhook-server-7f989f654f-8mq2q\" (UID: \"4e3da97e-a051-4d50-b905-3ed4c804cfc6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/61a9b00d-d330-4575-bdac-adff64f6786d-metallb-excludel2\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800690 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-248wj\" (UniqueName: \"kubernetes.io/projected/cd363b49-3f3c-46af-834d-5ab27e2ed35e-kube-api-access-248wj\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800757 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-conf\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800778 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76bgp\" (UniqueName: \"kubernetes.io/projected/88db3f72-8aff-4838-b58f-a37d0e2e2e64-kube-api-access-76bgp\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800815 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88db3f72-8aff-4838-b58f-a37d0e2e2e64-metrics-certs\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.800837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd363b49-3f3c-46af-834d-5ab27e2ed35e-cert\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: E0227 01:20:16.801487 4771 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 27 01:20:16 crc kubenswrapper[4771]: E0227 01:20:16.801560 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e3da97e-a051-4d50-b905-3ed4c804cfc6-cert podName:4e3da97e-a051-4d50-b905-3ed4c804cfc6 nodeName:}" failed. No retries permitted until 2026-02-27 01:20:17.301529562 +0000 UTC m=+930.239090850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e3da97e-a051-4d50-b905-3ed4c804cfc6-cert") pod "frr-k8s-webhook-server-7f989f654f-8mq2q" (UID: "4e3da97e-a051-4d50-b905-3ed4c804cfc6") : secret "frr-k8s-webhook-server-cert" not found Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.801629 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-metrics\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.801729 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-sockets\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.801820 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-reloader\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.801924 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-conf\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.802100 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88db3f72-8aff-4838-b58f-a37d0e2e2e64-frr-startup\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.820359 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88db3f72-8aff-4838-b58f-a37d0e2e2e64-metrics-certs\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.824400 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76bgp\" (UniqueName: \"kubernetes.io/projected/88db3f72-8aff-4838-b58f-a37d0e2e2e64-kube-api-access-76bgp\") pod \"frr-k8s-cvvlf\" (UID: \"88db3f72-8aff-4838-b58f-a37d0e2e2e64\") " pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.825202 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtc6g\" (UniqueName: \"kubernetes.io/projected/4e3da97e-a051-4d50-b905-3ed4c804cfc6-kube-api-access-mtc6g\") pod \"frr-k8s-webhook-server-7f989f654f-8mq2q\" (UID: \"4e3da97e-a051-4d50-b905-3ed4c804cfc6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.905098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd363b49-3f3c-46af-834d-5ab27e2ed35e-cert\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: 
I0227 01:20:16.905148 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw5kz\" (UniqueName: \"kubernetes.io/projected/61a9b00d-d330-4575-bdac-adff64f6786d-kube-api-access-jw5kz\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.905188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-metrics-certs\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.905208 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd363b49-3f3c-46af-834d-5ab27e2ed35e-metrics-certs\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.905241 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/61a9b00d-d330-4575-bdac-adff64f6786d-metallb-excludel2\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.905263 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.905285 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-248wj\" (UniqueName: \"kubernetes.io/projected/cd363b49-3f3c-46af-834d-5ab27e2ed35e-kube-api-access-248wj\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: E0227 01:20:16.905872 4771 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 27 01:20:16 crc kubenswrapper[4771]: E0227 01:20:16.905921 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist podName:61a9b00d-d330-4575-bdac-adff64f6786d nodeName:}" failed. No retries permitted until 2026-02-27 01:20:17.405907675 +0000 UTC m=+930.343468963 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist") pod "speaker-l5k7x" (UID: "61a9b00d-d330-4575-bdac-adff64f6786d") : secret "metallb-memberlist" not found Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.906459 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/61a9b00d-d330-4575-bdac-adff64f6786d-metallb-excludel2\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.908405 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.909182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd363b49-3f3c-46af-834d-5ab27e2ed35e-metrics-certs\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.909726 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-metrics-certs\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.919044 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd363b49-3f3c-46af-834d-5ab27e2ed35e-cert\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.920241 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-248wj\" (UniqueName: \"kubernetes.io/projected/cd363b49-3f3c-46af-834d-5ab27e2ed35e-kube-api-access-248wj\") pod \"controller-86ddb6bd46-sgfp9\" (UID: \"cd363b49-3f3c-46af-834d-5ab27e2ed35e\") " pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.921469 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw5kz\" (UniqueName: \"kubernetes.io/projected/61a9b00d-d330-4575-bdac-adff64f6786d-kube-api-access-jw5kz\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:16 crc kubenswrapper[4771]: I0227 01:20:16.973088 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.080465 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.311503 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3da97e-a051-4d50-b905-3ed4c804cfc6-cert\") pod \"frr-k8s-webhook-server-7f989f654f-8mq2q\" (UID: \"4e3da97e-a051-4d50-b905-3ed4c804cfc6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.318712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e3da97e-a051-4d50-b905-3ed4c804cfc6-cert\") pod \"frr-k8s-webhook-server-7f989f654f-8mq2q\" (UID: \"4e3da97e-a051-4d50-b905-3ed4c804cfc6\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.336478 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-sgfp9"] Feb 27 01:20:17 crc kubenswrapper[4771]: W0227 01:20:17.340519 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd363b49_3f3c_46af_834d_5ab27e2ed35e.slice/crio-053eb38ce188757b558925f694e722db73c33c94edd3476052edd1bba74ad2ab WatchSource:0}: Error finding container 053eb38ce188757b558925f694e722db73c33c94edd3476052edd1bba74ad2ab: Status 404 returned error can't find the container with id 053eb38ce188757b558925f694e722db73c33c94edd3476052edd1bba74ad2ab Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.413328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:17 crc kubenswrapper[4771]: E0227 01:20:17.413620 4771 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 27 01:20:17 crc kubenswrapper[4771]: E0227 01:20:17.413838 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist podName:61a9b00d-d330-4575-bdac-adff64f6786d nodeName:}" failed. No retries permitted until 2026-02-27 01:20:18.413815398 +0000 UTC m=+931.351376706 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist") pod "speaker-l5k7x" (UID: "61a9b00d-d330-4575-bdac-adff64f6786d") : secret "metallb-memberlist" not found Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.586834 4771 util.go:30] "No sandbox for pod can be found. 
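Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q"

Both MountVolume.SetUp failures above, the webhook server's cert and the speaker's memberlist secret, are startup races rather than faults: the metallb operator creates those secrets a moment after the pods are scheduled. The kubelet retries each failed mount with exponential backoff, visible as durationBeforeRetry 500ms and then 1s, and the mounts succeed just below once the secrets exist. A stdlib sketch of that doubling retry shape; the 500ms initial delay matches the log, while the cap here is an assumed placeholder rather than the kubelet's actual limit:

package main

import (
    "errors"
    "fmt"
    "time"
)

// mountWithBackoff retries op, doubling the wait after each failure,
// the same shape as the kubelet's durationBeforeRetry 500ms -> 1s -> ...
func mountWithBackoff(op func() error, initial, max time.Duration) error {
    delay := initial
    for {
        if err := op(); err == nil {
            return nil
        } else {
            fmt.Printf("failed: %v; retrying in %s\n", err, delay)
        }
        time.Sleep(delay)
        if delay *= 2; delay > max {
            delay = max
        }
    }
}

func main() {
    attempts := 0
    _ = mountWithBackoff(func() error {
        attempts++
        if attempts < 3 {
            return errors.New(`secret "metallb-memberlist" not found`)
        }
        return nil // the operator has created the secret by now
    }, 500*time.Millisecond, 2*time.Minute)
}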
Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.818577 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sgfp9" event={"ID":"cd363b49-3f3c-46af-834d-5ab27e2ed35e","Type":"ContainerStarted","Data":"8fdddcd2b768de124aae9e875d9f05d6952daa32d1d554f6df4c8958cbc57be7"} Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.819010 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sgfp9" event={"ID":"cd363b49-3f3c-46af-834d-5ab27e2ed35e","Type":"ContainerStarted","Data":"ffe7ec9b8fd077119deb3a031f11bc87e4ba3b6a4def53c8a047ffb9421b5792"} Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.819032 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sgfp9" event={"ID":"cd363b49-3f3c-46af-834d-5ab27e2ed35e","Type":"ContainerStarted","Data":"053eb38ce188757b558925f694e722db73c33c94edd3476052edd1bba74ad2ab"} Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.819276 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.820626 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerStarted","Data":"dcb2215298e0b68a5366293d2371e9f45919ce01fd25fdbcf1b9a0b7b853217a"} Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.839089 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-sgfp9" podStartSLOduration=1.839067083 podStartE2EDuration="1.839067083s" podCreationTimestamp="2026-02-27 01:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:20:17.838880018 +0000 UTC m=+930.776441326" watchObservedRunningTime="2026-02-27 01:20:17.839067083 +0000 UTC m=+930.776628391" Feb 27 01:20:17 crc kubenswrapper[4771]: I0227 01:20:17.855191 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q"] Feb 27 01:20:18 crc kubenswrapper[4771]: I0227 01:20:18.427484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:18 crc kubenswrapper[4771]: I0227 01:20:18.437093 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61a9b00d-d330-4575-bdac-adff64f6786d-memberlist\") pod \"speaker-l5k7x\" (UID: \"61a9b00d-d330-4575-bdac-adff64f6786d\") " pod="metallb-system/speaker-l5k7x" Feb 27 01:20:18 crc kubenswrapper[4771]: I0227 01:20:18.534745 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-l5k7x" Feb 27 01:20:18 crc kubenswrapper[4771]: W0227 01:20:18.575411 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61a9b00d_d330_4575_bdac_adff64f6786d.slice/crio-97f6be4da198842d84e6c98c3b15096b0e23220a0a8e1d65cdd3d45f5a5e9176 WatchSource:0}: Error finding container 97f6be4da198842d84e6c98c3b15096b0e23220a0a8e1d65cdd3d45f5a5e9176: Status 404 returned error can't find the container with id 97f6be4da198842d84e6c98c3b15096b0e23220a0a8e1d65cdd3d45f5a5e9176 Feb 27 01:20:18 crc kubenswrapper[4771]: I0227 01:20:18.829073 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" event={"ID":"4e3da97e-a051-4d50-b905-3ed4c804cfc6","Type":"ContainerStarted","Data":"732da0085d3cf05b437c3c01ed96cc9917d497eb082f9e16d4b60c88e029d65d"} Feb 27 01:20:18 crc kubenswrapper[4771]: I0227 01:20:18.832086 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l5k7x" event={"ID":"61a9b00d-d330-4575-bdac-adff64f6786d","Type":"ContainerStarted","Data":"89efea402820c081be17921b828c43b4a74da6be486327d1256d7da1a20f503a"} Feb 27 01:20:18 crc kubenswrapper[4771]: I0227 01:20:18.832117 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l5k7x" event={"ID":"61a9b00d-d330-4575-bdac-adff64f6786d","Type":"ContainerStarted","Data":"97f6be4da198842d84e6c98c3b15096b0e23220a0a8e1d65cdd3d45f5a5e9176"} Feb 27 01:20:19 crc kubenswrapper[4771]: I0227 01:20:19.885297 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-l5k7x" event={"ID":"61a9b00d-d330-4575-bdac-adff64f6786d","Type":"ContainerStarted","Data":"ef31989b043a3bdb1cc7b978a2af63aa30fc5aed6d26c0f79c28fd52de0c24d4"} Feb 27 01:20:19 crc kubenswrapper[4771]: I0227 01:20:19.885922 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-l5k7x" Feb 27 01:20:19 crc kubenswrapper[4771]: I0227 01:20:19.920411 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-l5k7x" podStartSLOduration=3.9203906760000002 podStartE2EDuration="3.920390676s" podCreationTimestamp="2026-02-27 01:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:20:19.915735908 +0000 UTC m=+932.853297196" watchObservedRunningTime="2026-02-27 01:20:19.920390676 +0000 UTC m=+932.857951964" Feb 27 01:20:24 crc kubenswrapper[4771]: I0227 01:20:24.918159 4771 generic.go:334] "Generic (PLEG): container finished" podID="88db3f72-8aff-4838-b58f-a37d0e2e2e64" containerID="b61c50533083c545a1001cc357bd231065d5dcbcc04034d0e7fb87f8851751b7" exitCode=0 Feb 27 01:20:24 crc kubenswrapper[4771]: I0227 01:20:24.918295 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerDied","Data":"b61c50533083c545a1001cc357bd231065d5dcbcc04034d0e7fb87f8851751b7"} Feb 27 01:20:24 crc kubenswrapper[4771]: I0227 01:20:24.921513 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" event={"ID":"4e3da97e-a051-4d50-b905-3ed4c804cfc6","Type":"ContainerStarted","Data":"8651ff0a23a4b04fd5e9cfa63334bca3e8dd35ed97f4d90fa530a685e7b7e24d"} Feb 27 01:20:24 crc kubenswrapper[4771]: I0227 01:20:24.921732 4771 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" Feb 27 01:20:24 crc kubenswrapper[4771]: I0227 01:20:24.980258 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q" podStartSLOduration=2.358208594 podStartE2EDuration="8.980238693s" podCreationTimestamp="2026-02-27 01:20:16 +0000 UTC" firstStartedPulling="2026-02-27 01:20:17.864301885 +0000 UTC m=+930.801863193" lastFinishedPulling="2026-02-27 01:20:24.486332004 +0000 UTC m=+937.423893292" observedRunningTime="2026-02-27 01:20:24.977352864 +0000 UTC m=+937.914914162" watchObservedRunningTime="2026-02-27 01:20:24.980238693 +0000 UTC m=+937.917799981" Feb 27 01:20:25 crc kubenswrapper[4771]: I0227 01:20:25.928503 4771 generic.go:334] "Generic (PLEG): container finished" podID="88db3f72-8aff-4838-b58f-a37d0e2e2e64" containerID="02b61e1ed0e55002dcb8c8b1226158fbb99f0f9f20c234cbd3ea7e883cc05eb6" exitCode=0 Feb 27 01:20:25 crc kubenswrapper[4771]: I0227 01:20:25.928580 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerDied","Data":"02b61e1ed0e55002dcb8c8b1226158fbb99f0f9f20c234cbd3ea7e883cc05eb6"} Feb 27 01:20:26 crc kubenswrapper[4771]: I0227 01:20:26.936630 4771 generic.go:334] "Generic (PLEG): container finished" podID="88db3f72-8aff-4838-b58f-a37d0e2e2e64" containerID="e370fed9946ca7dc6a448c845cd72a4b8cf081b18633bf93075de2e416497135" exitCode=0 Feb 27 01:20:26 crc kubenswrapper[4771]: I0227 01:20:26.936700 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerDied","Data":"e370fed9946ca7dc6a448c845cd72a4b8cf081b18633bf93075de2e416497135"} Feb 27 01:20:27 crc kubenswrapper[4771]: I0227 01:20:27.088419 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-sgfp9" Feb 27 01:20:27 crc kubenswrapper[4771]: I0227 01:20:27.958543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerStarted","Data":"36e514592eda9ebf32cced8b8ec87a075d791fe742a84c380b17e98fcf0d809a"} Feb 27 01:20:27 crc kubenswrapper[4771]: I0227 01:20:27.958625 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerStarted","Data":"37c22a2e951388574d0b2cbc89addaedf25e56699d81b4576f39209e3a5ab0f9"} Feb 27 01:20:27 crc kubenswrapper[4771]: I0227 01:20:27.958643 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerStarted","Data":"0b56810318ba7134b058d2d310ca0161ab6e29fb49e95ea1ef301ae0a9beede4"} Feb 27 01:20:27 crc kubenswrapper[4771]: I0227 01:20:27.958658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerStarted","Data":"177b21728cd3257a98f5497a6d0a1bfb3ec142d692290f356a143420fe6368d4"} Feb 27 01:20:27 crc kubenswrapper[4771]: I0227 01:20:27.958672 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" 
event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerStarted","Data":"7745638d2f04523af508fd254944773a52fc63296b1744d176b5d4e5eca6f9e4"} Feb 27 01:20:28 crc kubenswrapper[4771]: I0227 01:20:28.539448 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-l5k7x" Feb 27 01:20:28 crc kubenswrapper[4771]: I0227 01:20:28.952904 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:20:28 crc kubenswrapper[4771]: I0227 01:20:28.952960 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:20:28 crc kubenswrapper[4771]: I0227 01:20:28.953008 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:20:28 crc kubenswrapper[4771]: I0227 01:20:28.953645 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c06019bd1d417bdca00ed2eff4e51501f46dbc51fa52f89a80770d81ea06c432"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:20:28 crc kubenswrapper[4771]: I0227 01:20:28.953707 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://c06019bd1d417bdca00ed2eff4e51501f46dbc51fa52f89a80770d81ea06c432" gracePeriod=600 Feb 27 01:20:28 crc kubenswrapper[4771]: I0227 01:20:28.968574 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cvvlf" event={"ID":"88db3f72-8aff-4838-b58f-a37d0e2e2e64","Type":"ContainerStarted","Data":"241072bd7ee396ab3a031e7b0d69f4d5d808d34cde948de518668f58f76ed7f6"} Feb 27 01:20:28 crc kubenswrapper[4771]: I0227 01:20:28.968752 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:29 crc kubenswrapper[4771]: I0227 01:20:29.005276 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cvvlf" podStartSLOduration=5.671359377 podStartE2EDuration="13.005251364s" podCreationTimestamp="2026-02-27 01:20:16 +0000 UTC" firstStartedPulling="2026-02-27 01:20:17.106715224 +0000 UTC m=+930.044276522" lastFinishedPulling="2026-02-27 01:20:24.440607221 +0000 UTC m=+937.378168509" observedRunningTime="2026-02-27 01:20:28.999287711 +0000 UTC m=+941.936849009" watchObservedRunningTime="2026-02-27 01:20:29.005251364 +0000 UTC m=+941.942812662" Feb 27 01:20:29 crc kubenswrapper[4771]: I0227 01:20:29.978791 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="c06019bd1d417bdca00ed2eff4e51501f46dbc51fa52f89a80770d81ea06c432" exitCode=0 Feb 27 01:20:29 crc kubenswrapper[4771]: I0227 01:20:29.981735 4771 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"c06019bd1d417bdca00ed2eff4e51501f46dbc51fa52f89a80770d81ea06c432"} Feb 27 01:20:29 crc kubenswrapper[4771]: I0227 01:20:29.981968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"f3112e69f234defa1fcff4a9c5517c895c98346bf69153547a5fa6e13f50fed1"} Feb 27 01:20:29 crc kubenswrapper[4771]: I0227 01:20:29.982179 4771 scope.go:117] "RemoveContainer" containerID="dc866424c36588a9cdf7ab45975036bf986f480af4ea79144c7263e416051408" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.276336 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f5drf"] Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.278415 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f5drf" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.281368 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.282158 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nngmk" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.282753 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.304423 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f5drf"] Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.440798 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lg88\" (UniqueName: \"kubernetes.io/projected/d7ad77ef-f795-4824-9621-394d1c33dfef-kube-api-access-5lg88\") pod \"openstack-operator-index-f5drf\" (UID: \"d7ad77ef-f795-4824-9621-394d1c33dfef\") " pod="openstack-operators/openstack-operator-index-f5drf" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.542295 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lg88\" (UniqueName: \"kubernetes.io/projected/d7ad77ef-f795-4824-9621-394d1c33dfef-kube-api-access-5lg88\") pod \"openstack-operator-index-f5drf\" (UID: \"d7ad77ef-f795-4824-9621-394d1c33dfef\") " pod="openstack-operators/openstack-operator-index-f5drf" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.564861 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lg88\" (UniqueName: \"kubernetes.io/projected/d7ad77ef-f795-4824-9621-394d1c33dfef-kube-api-access-5lg88\") pod \"openstack-operator-index-f5drf\" (UID: \"d7ad77ef-f795-4824-9621-394d1c33dfef\") " pod="openstack-operators/openstack-operator-index-f5drf" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.597208 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-f5drf" Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.887133 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f5drf"] Feb 27 01:20:31 crc kubenswrapper[4771]: I0227 01:20:31.974152 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:32 crc kubenswrapper[4771]: I0227 01:20:32.005133 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f5drf" event={"ID":"d7ad77ef-f795-4824-9621-394d1c33dfef","Type":"ContainerStarted","Data":"b972b782f0d0d16a43bd76e103cbc989c216b611049045b56df42ef2ef85f631"} Feb 27 01:20:32 crc kubenswrapper[4771]: I0227 01:20:32.024707 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:34 crc kubenswrapper[4771]: I0227 01:20:34.669234 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-f5drf"] Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.026483 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f5drf" event={"ID":"d7ad77ef-f795-4824-9621-394d1c33dfef","Type":"ContainerStarted","Data":"1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415"} Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.057995 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f5drf" podStartSLOduration=1.639159996 podStartE2EDuration="4.057964547s" podCreationTimestamp="2026-02-27 01:20:31 +0000 UTC" firstStartedPulling="2026-02-27 01:20:31.897275935 +0000 UTC m=+944.834837233" lastFinishedPulling="2026-02-27 01:20:34.316080496 +0000 UTC m=+947.253641784" observedRunningTime="2026-02-27 01:20:35.049747131 +0000 UTC m=+947.987308459" watchObservedRunningTime="2026-02-27 01:20:35.057964547 +0000 UTC m=+947.995525865" Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.282816 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-d8jl8"] Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.284271 4771 util.go:30] "No sandbox for pod can be found. 
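Every pod_startup_latency_tracker record in this log reports two figures: podStartE2EDuration (observed running time minus podCreationTimestamp) and podStartSLOduration, which additionally excludes time spent pulling images. The arithmetic checks out against the openstack-operator-index-f5drf record above when computed from the monotonic m=+ offsets; a small Go sketch of the relationship (exact to the logged values modulo float rounding):

    package main

    import "fmt"

    func main() {
    	// Monotonic m=+ offsets (seconds) from the f5drf record above.
    	const (
    		firstStartedPulling = 944.834837233
    		lastFinishedPulling = 947.253641784
    		podStartE2E         = 4.057964547 // watchObservedRunningTime - podCreationTimestamp
    	)
    	pull := lastFinishedPulling - firstStartedPulling           // 2.418804551s spent pulling
    	fmt.Printf("podStartSLOduration = %.9f\n", podStartE2E-pull) // 1.639159996, as logged
    }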
Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.284271 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-d8jl8"
Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.294766 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-d8jl8"]
Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.401395 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8m98\" (UniqueName: \"kubernetes.io/projected/6846ec0e-56f5-4bad-9539-0f6578027f45-kube-api-access-m8m98\") pod \"openstack-operator-index-d8jl8\" (UID: \"6846ec0e-56f5-4bad-9539-0f6578027f45\") " pod="openstack-operators/openstack-operator-index-d8jl8"
Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.502538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8m98\" (UniqueName: \"kubernetes.io/projected/6846ec0e-56f5-4bad-9539-0f6578027f45-kube-api-access-m8m98\") pod \"openstack-operator-index-d8jl8\" (UID: \"6846ec0e-56f5-4bad-9539-0f6578027f45\") " pod="openstack-operators/openstack-operator-index-d8jl8"
Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.523127 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8m98\" (UniqueName: \"kubernetes.io/projected/6846ec0e-56f5-4bad-9539-0f6578027f45-kube-api-access-m8m98\") pod \"openstack-operator-index-d8jl8\" (UID: \"6846ec0e-56f5-4bad-9539-0f6578027f45\") " pod="openstack-operators/openstack-operator-index-d8jl8"
Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.618517 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-d8jl8"
Feb 27 01:20:35 crc kubenswrapper[4771]: I0227 01:20:35.859786 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-d8jl8"]
Feb 27 01:20:35 crc kubenswrapper[4771]: W0227 01:20:35.871714 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6846ec0e_56f5_4bad_9539_0f6578027f45.slice/crio-a899abf820f8c2dd778e9a0cda3c64937f6aee151a6eaae39198a7ec89693d4b WatchSource:0}: Error finding container a899abf820f8c2dd778e9a0cda3c64937f6aee151a6eaae39198a7ec89693d4b: Status 404 returned error can't find the container with id a899abf820f8c2dd778e9a0cda3c64937f6aee151a6eaae39198a7ec89693d4b
Feb 27 01:20:36 crc kubenswrapper[4771]: I0227 01:20:36.037999 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-d8jl8" event={"ID":"6846ec0e-56f5-4bad-9539-0f6578027f45","Type":"ContainerStarted","Data":"a899abf820f8c2dd778e9a0cda3c64937f6aee151a6eaae39198a7ec89693d4b"}
Feb 27 01:20:36 crc kubenswrapper[4771]: I0227 01:20:36.038155 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-f5drf" podUID="d7ad77ef-f795-4824-9621-394d1c33dfef" containerName="registry-server" containerID="cri-o://1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415" gracePeriod=2
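gracePeriod=2 on the registry-server kill above means: deliver SIGTERM, wait up to two seconds for a clean exit, then escalate to SIGKILL. A self-contained Go sketch of that sequence; the real path goes through CRI-O's StopContainer rather than raw process signals:

    package main

    import (
    	"os/exec"
    	"syscall"
    	"time"
    )

    // stopWithGrace mirrors "Killing container with a grace period":
    // SIGTERM first, SIGKILL only if the process outlives the grace period.
    func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
    	done := make(chan error, 1)
    	go func() { done <- cmd.Wait() }()
    	cmd.Process.Signal(syscall.SIGTERM) // polite request to exit
    	select {
    	case <-done: // exited within the grace period
    	case <-time.After(grace):
    		cmd.Process.Kill() // grace period expired: hard stop
    		<-done
    	}
    }

    func main() {
    	cmd := exec.Command("sleep", "60")
    	if err := cmd.Start(); err != nil {
    		panic(err)
    	}
    	stopWithGrace(cmd, 2*time.Second) // gracePeriod=2, as in the log
    }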
Feb 27 01:20:36 crc kubenswrapper[4771]: I0227 01:20:36.398516 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f5drf"
Feb 27 01:20:36 crc kubenswrapper[4771]: I0227 01:20:36.524005 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lg88\" (UniqueName: \"kubernetes.io/projected/d7ad77ef-f795-4824-9621-394d1c33dfef-kube-api-access-5lg88\") pod \"d7ad77ef-f795-4824-9621-394d1c33dfef\" (UID: \"d7ad77ef-f795-4824-9621-394d1c33dfef\") "
Feb 27 01:20:36 crc kubenswrapper[4771]: I0227 01:20:36.533938 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ad77ef-f795-4824-9621-394d1c33dfef-kube-api-access-5lg88" (OuterVolumeSpecName: "kube-api-access-5lg88") pod "d7ad77ef-f795-4824-9621-394d1c33dfef" (UID: "d7ad77ef-f795-4824-9621-394d1c33dfef"). InnerVolumeSpecName "kube-api-access-5lg88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:20:36 crc kubenswrapper[4771]: I0227 01:20:36.625936 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lg88\" (UniqueName: \"kubernetes.io/projected/d7ad77ef-f795-4824-9621-394d1c33dfef-kube-api-access-5lg88\") on node \"crc\" DevicePath \"\""
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.046817 4771 generic.go:334] "Generic (PLEG): container finished" podID="d7ad77ef-f795-4824-9621-394d1c33dfef" containerID="1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415" exitCode=0
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.046919 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f5drf" event={"ID":"d7ad77ef-f795-4824-9621-394d1c33dfef","Type":"ContainerDied","Data":"1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415"}
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.046967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f5drf" event={"ID":"d7ad77ef-f795-4824-9621-394d1c33dfef","Type":"ContainerDied","Data":"b972b782f0d0d16a43bd76e103cbc989c216b611049045b56df42ef2ef85f631"}
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.046988 4771 scope.go:117] "RemoveContainer" containerID="1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415"
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.048343 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f5drf"
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.049412 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-d8jl8" event={"ID":"6846ec0e-56f5-4bad-9539-0f6578027f45","Type":"ContainerStarted","Data":"9d83abd813ad614e7788688203380da3f1ec4026c61b535adc227ae3946231a9"}
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.072308 4771 scope.go:117] "RemoveContainer" containerID="1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415"
Feb 27 01:20:37 crc kubenswrapper[4771]: E0227 01:20:37.072734 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415\": container with ID starting with 1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415 not found: ID does not exist" containerID="1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415"
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.072775 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415"} err="failed to get container status \"1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415\": rpc error: code = NotFound desc = could not find container \"1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415\": container with ID starting with 1f5df769e041a56d75eff78d3be4e15bd798a0597e8dbee1eb5a7d67a6873415 not found: ID does not exist"
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.074337 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-d8jl8" podStartSLOduration=2.018241309 podStartE2EDuration="2.074315477s" podCreationTimestamp="2026-02-27 01:20:35 +0000 UTC" firstStartedPulling="2026-02-27 01:20:35.879207934 +0000 UTC m=+948.816769232" lastFinishedPulling="2026-02-27 01:20:35.935282112 +0000 UTC m=+948.872843400" observedRunningTime="2026-02-27 01:20:37.069596508 +0000 UTC m=+950.007157806" watchObservedRunningTime="2026-02-27 01:20:37.074315477 +0000 UTC m=+950.011876785"
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.092495 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-f5drf"]
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.100541 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-f5drf"]
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.595057 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8mq2q"
Feb 27 01:20:37 crc kubenswrapper[4771]: I0227 01:20:37.786814 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ad77ef-f795-4824-9621-394d1c33dfef" path="/var/lib/kubelet/pods/d7ad77ef-f795-4824-9621-394d1c33dfef/volumes"
Feb 27 01:20:45 crc kubenswrapper[4771]: I0227 01:20:45.619333 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-d8jl8"
Feb 27 01:20:45 crc kubenswrapper[4771]: I0227 01:20:45.620139 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-d8jl8"
"SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-d8jl8" Feb 27 01:20:46 crc kubenswrapper[4771]: I0227 01:20:46.153868 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-d8jl8" Feb 27 01:20:46 crc kubenswrapper[4771]: I0227 01:20:46.927753 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h"] Feb 27 01:20:46 crc kubenswrapper[4771]: E0227 01:20:46.937437 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ad77ef-f795-4824-9621-394d1c33dfef" containerName="registry-server" Feb 27 01:20:46 crc kubenswrapper[4771]: I0227 01:20:46.937651 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ad77ef-f795-4824-9621-394d1c33dfef" containerName="registry-server" Feb 27 01:20:46 crc kubenswrapper[4771]: I0227 01:20:46.938017 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ad77ef-f795-4824-9621-394d1c33dfef" containerName="registry-server" Feb 27 01:20:46 crc kubenswrapper[4771]: I0227 01:20:46.939730 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:46 crc kubenswrapper[4771]: I0227 01:20:46.942760 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-qdk47" Feb 27 01:20:46 crc kubenswrapper[4771]: I0227 01:20:46.963521 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h"] Feb 27 01:20:46 crc kubenswrapper[4771]: I0227 01:20:46.985794 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cvvlf" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.075902 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzmlr\" (UniqueName: \"kubernetes.io/projected/46247d46-066d-45e2-975a-8404fd28a0ac-kube-api-access-lzmlr\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.076011 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-util\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.076159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-bundle\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.177435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzmlr\" (UniqueName: 
\"kubernetes.io/projected/46247d46-066d-45e2-975a-8404fd28a0ac-kube-api-access-lzmlr\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.177588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-util\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.177686 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-bundle\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.178186 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-util\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.178323 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-bundle\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.205681 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzmlr\" (UniqueName: \"kubernetes.io/projected/46247d46-066d-45e2-975a-8404fd28a0ac-kube-api-access-lzmlr\") pod \"d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") " pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.270412 4771 util.go:30] "No sandbox for pod can be found. 
Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.270412 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h"
Feb 27 01:20:47 crc kubenswrapper[4771]: I0227 01:20:47.729246 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h"]
Feb 27 01:20:47 crc kubenswrapper[4771]: W0227 01:20:47.732616 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46247d46_066d_45e2_975a_8404fd28a0ac.slice/crio-1f0af8c618ae5ec549d24e1e448ec36bf66aa1dc285a04b9ea1768d41bca8c18 WatchSource:0}: Error finding container 1f0af8c618ae5ec549d24e1e448ec36bf66aa1dc285a04b9ea1768d41bca8c18: Status 404 returned error can't find the container with id 1f0af8c618ae5ec549d24e1e448ec36bf66aa1dc285a04b9ea1768d41bca8c18
Feb 27 01:20:48 crc kubenswrapper[4771]: I0227 01:20:48.134423 4771 generic.go:334] "Generic (PLEG): container finished" podID="46247d46-066d-45e2-975a-8404fd28a0ac" containerID="fdce322695fda67f3974f4ac3bc63b64358600d0668eb01305702ac9a99eb8ac" exitCode=0
Feb 27 01:20:48 crc kubenswrapper[4771]: I0227 01:20:48.134646 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" event={"ID":"46247d46-066d-45e2-975a-8404fd28a0ac","Type":"ContainerDied","Data":"fdce322695fda67f3974f4ac3bc63b64358600d0668eb01305702ac9a99eb8ac"}
Feb 27 01:20:48 crc kubenswrapper[4771]: I0227 01:20:48.134785 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" event={"ID":"46247d46-066d-45e2-975a-8404fd28a0ac","Type":"ContainerStarted","Data":"1f0af8c618ae5ec549d24e1e448ec36bf66aa1dc285a04b9ea1768d41bca8c18"}
Feb 27 01:20:49 crc kubenswrapper[4771]: I0227 01:20:49.145631 4771 generic.go:334] "Generic (PLEG): container finished" podID="46247d46-066d-45e2-975a-8404fd28a0ac" containerID="b8f21a3d7f58cb6fd9777f624ffaa7c47ac31e0cc31720c53e42c18ab340e4ef" exitCode=0
Feb 27 01:20:49 crc kubenswrapper[4771]: I0227 01:20:49.145716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" event={"ID":"46247d46-066d-45e2-975a-8404fd28a0ac","Type":"ContainerDied","Data":"b8f21a3d7f58cb6fd9777f624ffaa7c47ac31e0cc31720c53e42c18ab340e4ef"}
Feb 27 01:20:50 crc kubenswrapper[4771]: I0227 01:20:50.157840 4771 generic.go:334] "Generic (PLEG): container finished" podID="46247d46-066d-45e2-975a-8404fd28a0ac" containerID="9e8c36f2f137ba17aed1c3d8ea7aa0454384ecfea4e875e8d5be2b38f59ac8b5" exitCode=0
Feb 27 01:20:50 crc kubenswrapper[4771]: I0227 01:20:50.157893 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" event={"ID":"46247d46-066d-45e2-975a-8404fd28a0ac","Type":"ContainerDied","Data":"9e8c36f2f137ba17aed1c3d8ea7aa0454384ecfea4e875e8d5be2b38f59ac8b5"}
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.528770 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h"
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.659968 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzmlr\" (UniqueName: \"kubernetes.io/projected/46247d46-066d-45e2-975a-8404fd28a0ac-kube-api-access-lzmlr\") pod \"46247d46-066d-45e2-975a-8404fd28a0ac\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") "
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.660338 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-bundle\") pod \"46247d46-066d-45e2-975a-8404fd28a0ac\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") "
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.660434 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-util\") pod \"46247d46-066d-45e2-975a-8404fd28a0ac\" (UID: \"46247d46-066d-45e2-975a-8404fd28a0ac\") "
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.669326 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46247d46-066d-45e2-975a-8404fd28a0ac-kube-api-access-lzmlr" (OuterVolumeSpecName: "kube-api-access-lzmlr") pod "46247d46-066d-45e2-975a-8404fd28a0ac" (UID: "46247d46-066d-45e2-975a-8404fd28a0ac"). InnerVolumeSpecName "kube-api-access-lzmlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.674258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-bundle" (OuterVolumeSpecName: "bundle") pod "46247d46-066d-45e2-975a-8404fd28a0ac" (UID: "46247d46-066d-45e2-975a-8404fd28a0ac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.675343 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-util" (OuterVolumeSpecName: "util") pod "46247d46-066d-45e2-975a-8404fd28a0ac" (UID: "46247d46-066d-45e2-975a-8404fd28a0ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.762717 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.763145 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46247d46-066d-45e2-975a-8404fd28a0ac-util\") on node \"crc\" DevicePath \"\""
Feb 27 01:20:51 crc kubenswrapper[4771]: I0227 01:20:51.763221 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzmlr\" (UniqueName: \"kubernetes.io/projected/46247d46-066d-45e2-975a-8404fd28a0ac-kube-api-access-lzmlr\") on node \"crc\" DevicePath \"\""
Feb 27 01:20:52 crc kubenswrapper[4771]: I0227 01:20:52.184647 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h" event={"ID":"46247d46-066d-45e2-975a-8404fd28a0ac","Type":"ContainerDied","Data":"1f0af8c618ae5ec549d24e1e448ec36bf66aa1dc285a04b9ea1768d41bca8c18"}
Feb 27 01:20:52 crc kubenswrapper[4771]: I0227 01:20:52.184700 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f0af8c618ae5ec549d24e1e448ec36bf66aa1dc285a04b9ea1768d41bca8c18"
Feb 27 01:20:52 crc kubenswrapper[4771]: I0227 01:20:52.184750 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.718113 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"]
Feb 27 01:20:59 crc kubenswrapper[4771]: E0227 01:20:59.718949 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46247d46-066d-45e2-975a-8404fd28a0ac" containerName="pull"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.718962 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="46247d46-066d-45e2-975a-8404fd28a0ac" containerName="pull"
Feb 27 01:20:59 crc kubenswrapper[4771]: E0227 01:20:59.718983 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46247d46-066d-45e2-975a-8404fd28a0ac" containerName="extract"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.718988 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="46247d46-066d-45e2-975a-8404fd28a0ac" containerName="extract"
Feb 27 01:20:59 crc kubenswrapper[4771]: E0227 01:20:59.718997 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46247d46-066d-45e2-975a-8404fd28a0ac" containerName="util"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.719003 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="46247d46-066d-45e2-975a-8404fd28a0ac" containerName="util"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.719109 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="46247d46-066d-45e2-975a-8404fd28a0ac" containerName="extract"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.719536 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.721991 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-vkzgm"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.738805 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"]
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.887721 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmnd\" (UniqueName: \"kubernetes.io/projected/79f9396a-5f0c-4909-b710-4914faa9e011-kube-api-access-xwmnd\") pod \"openstack-operator-controller-init-b5b8f6cf4-m2tbq\" (UID: \"79f9396a-5f0c-4909-b710-4914faa9e011\") " pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"
Feb 27 01:20:59 crc kubenswrapper[4771]: I0227 01:20:59.989346 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmnd\" (UniqueName: \"kubernetes.io/projected/79f9396a-5f0c-4909-b710-4914faa9e011-kube-api-access-xwmnd\") pod \"openstack-operator-controller-init-b5b8f6cf4-m2tbq\" (UID: \"79f9396a-5f0c-4909-b710-4914faa9e011\") " pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"
Feb 27 01:21:00 crc kubenswrapper[4771]: I0227 01:21:00.007359 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwmnd\" (UniqueName: \"kubernetes.io/projected/79f9396a-5f0c-4909-b710-4914faa9e011-kube-api-access-xwmnd\") pod \"openstack-operator-controller-init-b5b8f6cf4-m2tbq\" (UID: \"79f9396a-5f0c-4909-b710-4914faa9e011\") " pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"
Feb 27 01:21:00 crc kubenswrapper[4771]: I0227 01:21:00.087375 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"
Feb 27 01:21:00 crc kubenswrapper[4771]: I0227 01:21:00.568309 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"]
Feb 27 01:21:00 crc kubenswrapper[4771]: W0227 01:21:00.573355 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f9396a_5f0c_4909_b710_4914faa9e011.slice/crio-06c3e7eb84bc9946a498f471ebe6f3e2912ad390f13eb2c5a7c23cc5da897504 WatchSource:0}: Error finding container 06c3e7eb84bc9946a498f471ebe6f3e2912ad390f13eb2c5a7c23cc5da897504: Status 404 returned error can't find the container with id 06c3e7eb84bc9946a498f471ebe6f3e2912ad390f13eb2c5a7c23cc5da897504
Feb 27 01:21:01 crc kubenswrapper[4771]: I0227 01:21:01.270962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq" event={"ID":"79f9396a-5f0c-4909-b710-4914faa9e011","Type":"ContainerStarted","Data":"06c3e7eb84bc9946a498f471ebe6f3e2912ad390f13eb2c5a7c23cc5da897504"}
Feb 27 01:21:05 crc kubenswrapper[4771]: I0227 01:21:05.312077 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq" event={"ID":"79f9396a-5f0c-4909-b710-4914faa9e011","Type":"ContainerStarted","Data":"240bd94a570d440b94eeaa25eee072dba425287f60109aa86a8aefaa8c644e68"}
Feb 27 01:21:05 crc kubenswrapper[4771]: I0227 01:21:05.312686 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"
Feb 27 01:21:05 crc kubenswrapper[4771]: I0227 01:21:05.361353 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq" podStartSLOduration=2.607264937 podStartE2EDuration="6.361328631s" podCreationTimestamp="2026-02-27 01:20:59 +0000 UTC" firstStartedPulling="2026-02-27 01:21:00.575606709 +0000 UTC m=+973.513168007" lastFinishedPulling="2026-02-27 01:21:04.329670383 +0000 UTC m=+977.267231701" observedRunningTime="2026-02-27 01:21:05.356387915 +0000 UTC m=+978.293949223" watchObservedRunningTime="2026-02-27 01:21:05.361328631 +0000 UTC m=+978.298889959"
Feb 27 01:21:10 crc kubenswrapper[4771]: I0227 01:21:10.092655 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-b5b8f6cf4-m2tbq"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.197783 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.199007 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.200518 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fhhss"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.203407 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.204151 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.207378 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-np45p"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.209694 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.219215 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.224151 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.225008 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.226788 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qdtqn"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.240137 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.241123 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.243318 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rdv48"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.249696 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.262725 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.319031 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.320450 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.323845 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.327033 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mcs85"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.330637 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhqt\" (UniqueName: \"kubernetes.io/projected/8bd8d6ef-0025-4148-a530-1964ae763645-kube-api-access-gmhqt\") pod \"barbican-operator-controller-manager-868647ff47-mrvth\" (UID: \"8bd8d6ef-0025-4148-a530-1964ae763645\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.330711 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74hsx\" (UniqueName: \"kubernetes.io/projected/f882b343-7b46-4516-9a17-833858bbfda7-kube-api-access-74hsx\") pod \"cinder-operator-controller-manager-55d77d7b5c-9jpm2\" (UID: \"f882b343-7b46-4516-9a17-833858bbfda7\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.330731 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sk8c\" (UniqueName: \"kubernetes.io/projected/9f4615e8-ebc8-43ff-bdec-481f86af58bf-kube-api-access-7sk8c\") pod \"designate-operator-controller-manager-6d8bf5c495-snqrx\" (UID: \"9f4615e8-ebc8-43ff-bdec-481f86af58bf\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.330759 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhbhh\" (UniqueName: \"kubernetes.io/projected/f77508f2-411f-4644-9b48-7edbefaf3bb4-kube-api-access-dhbhh\") pod \"glance-operator-controller-manager-784b5bb6c5-zlggr\" (UID: \"f77508f2-411f-4644-9b48-7edbefaf3bb4\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.341151 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.342078 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.345918 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.346811 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.349385 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.359351 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-cp6vm"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.359700 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.364015 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xx482"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.365492 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.378662 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.379644 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.385001 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nsrs9"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.394687 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.411857 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.412681 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.414894 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.417955 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hm6c8"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.427732 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-llvjw"]
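Each of these operator pods gets a projected service-account token volume named kube-api-access- plus a five-character suffix (gmhqt, 74hsx, 7sk8c, dhbhh, ...). A sketch of generating such a suffix; the vowel-free alphabet mirrors the one Kubernetes uses for generated names, though treat the exact character set here as an assumption rather than a guarantee:

    package main

    import (
    	"fmt"
    	"math/rand"
    )

    // Alphabet with no vowels, so random suffixes cannot spell words.
    const alphanums = "bcdfghjklmnpqrstvwxz2456789"

    func suffix(n int) string {
    	b := make([]byte, n)
    	for i := range b {
    		b[i] = alphanums[rand.Intn(len(alphanums))]
    	}
    	return string(b)
    }

    func main() {
    	fmt.Println("kube-api-access-" + suffix(5)) // e.g. kube-api-access-gmhqt
    }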
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.428536 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432446 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74hsx\" (UniqueName: \"kubernetes.io/projected/f882b343-7b46-4516-9a17-833858bbfda7-kube-api-access-74hsx\") pod \"cinder-operator-controller-manager-55d77d7b5c-9jpm2\" (UID: \"f882b343-7b46-4516-9a17-833858bbfda7\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432525 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sk8c\" (UniqueName: \"kubernetes.io/projected/9f4615e8-ebc8-43ff-bdec-481f86af58bf-kube-api-access-7sk8c\") pod \"designate-operator-controller-manager-6d8bf5c495-snqrx\" (UID: \"9f4615e8-ebc8-43ff-bdec-481f86af58bf\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432567 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhbhh\" (UniqueName: \"kubernetes.io/projected/f77508f2-411f-4644-9b48-7edbefaf3bb4-kube-api-access-dhbhh\") pod \"glance-operator-controller-manager-784b5bb6c5-zlggr\" (UID: \"f77508f2-411f-4644-9b48-7edbefaf3bb4\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432593 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shlfh\" (UniqueName: \"kubernetes.io/projected/b563eec9-7160-44db-a640-4cf7e25bc893-kube-api-access-shlfh\") pod \"ironic-operator-controller-manager-554564d7fc-w5hxv\" (UID: \"b563eec9-7160-44db-a640-4cf7e25bc893\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432613 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9677g\" (UniqueName: \"kubernetes.io/projected/5ea9fc68-1ea7-48fe-b692-f99747dbd694-kube-api-access-9677g\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432638 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcfsc\" (UniqueName: \"kubernetes.io/projected/646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7-kube-api-access-fcfsc\") pod \"horizon-operator-controller-manager-5b9b8895d5-p8rvj\" (UID: \"646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v4sd\" (UniqueName: \"kubernetes.io/projected/a40b776f-5677-4909-8b04-a5b2318737bc-kube-api-access-9v4sd\") pod \"heat-operator-controller-manager-69f49c598c-x969l\" (UID: \"a40b776f-5677-4909-8b04-a5b2318737bc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.432691 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhqt\" (UniqueName: \"kubernetes.io/projected/8bd8d6ef-0025-4148-a530-1964ae763645-kube-api-access-gmhqt\") pod \"barbican-operator-controller-manager-868647ff47-mrvth\" (UID: \"8bd8d6ef-0025-4148-a530-1964ae763645\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.442497 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-llvjw"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.446819 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-kldnt"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.460872 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.461874 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.466488 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-c76pg"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.475896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sk8c\" (UniqueName: \"kubernetes.io/projected/9f4615e8-ebc8-43ff-bdec-481f86af58bf-kube-api-access-7sk8c\") pod \"designate-operator-controller-manager-6d8bf5c495-snqrx\" (UID: \"9f4615e8-ebc8-43ff-bdec-481f86af58bf\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.478213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhbhh\" (UniqueName: \"kubernetes.io/projected/f77508f2-411f-4644-9b48-7edbefaf3bb4-kube-api-access-dhbhh\") pod \"glance-operator-controller-manager-784b5bb6c5-zlggr\" (UID: \"f77508f2-411f-4644-9b48-7edbefaf3bb4\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.483295 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.507461 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74hsx\" (UniqueName: \"kubernetes.io/projected/f882b343-7b46-4516-9a17-833858bbfda7-kube-api-access-74hsx\") pod \"cinder-operator-controller-manager-55d77d7b5c-9jpm2\" (UID: \"f882b343-7b46-4516-9a17-833858bbfda7\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.513872 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.514686 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.520532 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2b9lm"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.523337 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhqt\" (UniqueName: \"kubernetes.io/projected/8bd8d6ef-0025-4148-a530-1964ae763645-kube-api-access-gmhqt\") pod \"barbican-operator-controller-manager-868647ff47-mrvth\" (UID: \"8bd8d6ef-0025-4148-a530-1964ae763645\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.536871 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.536940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcbsj\" (UniqueName: \"kubernetes.io/projected/0c8b88b1-8f42-458c-933e-0bcd17da38cb-kube-api-access-kcbsj\") pod \"manila-operator-controller-manager-67d996989d-llvjw\" (UID: \"0c8b88b1-8f42-458c-933e-0bcd17da38cb\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.536989 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shlfh\" (UniqueName: \"kubernetes.io/projected/b563eec9-7160-44db-a640-4cf7e25bc893-kube-api-access-shlfh\") pod \"ironic-operator-controller-manager-554564d7fc-w5hxv\" (UID: \"b563eec9-7160-44db-a640-4cf7e25bc893\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.537017 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9677g\" (UniqueName: \"kubernetes.io/projected/5ea9fc68-1ea7-48fe-b692-f99747dbd694-kube-api-access-9677g\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.537058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcfsc\" (UniqueName: \"kubernetes.io/projected/646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7-kube-api-access-fcfsc\") pod \"horizon-operator-controller-manager-5b9b8895d5-p8rvj\" (UID: \"646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.537080 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v4sd\" (UniqueName: \"kubernetes.io/projected/a40b776f-5677-4909-8b04-a5b2318737bc-kube-api-access-9v4sd\") pod \"heat-operator-controller-manager-69f49c598c-x969l\" (UID: \"a40b776f-5677-4909-8b04-a5b2318737bc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.537124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72ggg\" (UniqueName: \"kubernetes.io/projected/eb603c5e-cb7c-41e4-ac8a-f9a960141d16-kube-api-access-72ggg\") pod \"keystone-operator-controller-manager-b4d948c87-t65sw\" (UID: \"eb603c5e-cb7c-41e4-ac8a-f9a960141d16\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.537157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmnsh\" (UniqueName: \"kubernetes.io/projected/17dfc012-107f-437d-bbfd-13a1250857ed-kube-api-access-mmnsh\") pod \"mariadb-operator-controller-manager-6994f66f48-df8gr\" (UID: \"17dfc012-107f-437d-bbfd-13a1250857ed\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr"
Feb 27 01:21:30 crc kubenswrapper[4771]: E0227 01:21:30.537376 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 27 01:21:30 crc kubenswrapper[4771]: E0227 01:21:30.537429 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert podName:5ea9fc68-1ea7-48fe-b692-f99747dbd694 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:31.03740768 +0000 UTC m=+1003.974968978 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert") pod "infra-operator-controller-manager-79d975b745-4p5fg" (UID: "5ea9fc68-1ea7-48fe-b692-f99747dbd694") : secret "infra-operator-webhook-server-cert" not found
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.540845 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.546873 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.565812 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.572868 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.573681 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.578145 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v4sd\" (UniqueName: \"kubernetes.io/projected/a40b776f-5677-4909-8b04-a5b2318737bc-kube-api-access-9v4sd\") pod \"heat-operator-controller-manager-69f49c598c-x969l\" (UID: \"a40b776f-5677-4909-8b04-a5b2318737bc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.593463 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.597808 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.597847 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kc287"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.598600 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.601307 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9677g\" (UniqueName: \"kubernetes.io/projected/5ea9fc68-1ea7-48fe-b692-f99747dbd694-kube-api-access-9677g\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.614616 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcfsc\" (UniqueName: \"kubernetes.io/projected/646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7-kube-api-access-fcfsc\") pod \"horizon-operator-controller-manager-5b9b8895d5-p8rvj\" (UID: \"646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.615380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shlfh\" (UniqueName: \"kubernetes.io/projected/b563eec9-7160-44db-a640-4cf7e25bc893-kube-api-access-shlfh\") pod \"ironic-operator-controller-manager-554564d7fc-w5hxv\" (UID: \"b563eec9-7160-44db-a640-4cf7e25bc893\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.624923 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-scrll"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.628615 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.642442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72ggg\" (UniqueName: \"kubernetes.io/projected/eb603c5e-cb7c-41e4-ac8a-f9a960141d16-kube-api-access-72ggg\") pod \"keystone-operator-controller-manager-b4d948c87-t65sw\" (UID: \"eb603c5e-cb7c-41e4-ac8a-f9a960141d16\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.642485 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdlgw\" (UniqueName: \"kubernetes.io/projected/61b58ad1-8db7-4a41-9774-38781245baff-kube-api-access-hdlgw\") pod \"nova-operator-controller-manager-567668f5cf-4fsjk\" (UID: \"61b58ad1-8db7-4a41-9774-38781245baff\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.642506 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmnsh\" (UniqueName: \"kubernetes.io/projected/17dfc012-107f-437d-bbfd-13a1250857ed-kube-api-access-mmnsh\") pod \"mariadb-operator-controller-manager-6994f66f48-df8gr\" (UID: \"17dfc012-107f-437d-bbfd-13a1250857ed\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.642589 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbsj\" (UniqueName: \"kubernetes.io/projected/0c8b88b1-8f42-458c-933e-0bcd17da38cb-kube-api-access-kcbsj\") pod \"manila-operator-controller-manager-67d996989d-llvjw\" (UID: \"0c8b88b1-8f42-458c-933e-0bcd17da38cb\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.642610 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbx2f\" (UniqueName: \"kubernetes.io/projected/20a5fef1-ac14-40c6-bb97-6e6f39be1645-kube-api-access-hbx2f\") pod \"neutron-operator-controller-manager-6bd4687957-6j9rs\" (UID: \"20a5fef1-ac14-40c6-bb97-6e6f39be1645\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.651619 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.651935 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.670786 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.676481 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.679846 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.694823 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.695748 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.695816 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-k725j"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.702708 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds"]
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.704939 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.729242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72ggg\" (UniqueName: \"kubernetes.io/projected/eb603c5e-cb7c-41e4-ac8a-f9a960141d16-kube-api-access-72ggg\") pod \"keystone-operator-controller-manager-b4d948c87-t65sw\" (UID: \"eb603c5e-cb7c-41e4-ac8a-f9a960141d16\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.733040 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.734631 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-62fjg"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.738665 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.747462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbx2f\" (UniqueName: \"kubernetes.io/projected/20a5fef1-ac14-40c6-bb97-6e6f39be1645-kube-api-access-hbx2f\") pod \"neutron-operator-controller-manager-6bd4687957-6j9rs\" (UID: \"20a5fef1-ac14-40c6-bb97-6e6f39be1645\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.747510 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwxgw\" (UniqueName: \"kubernetes.io/projected/7cf10a28-d86e-4299-8b06-84888ca3dcb9-kube-api-access-nwxgw\") pod \"ovn-operator-controller-manager-5955d8c787-2qcds\" (UID: \"7cf10a28-d86e-4299-8b06-84888ca3dcb9\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.747538 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn8r9\" (UniqueName: \"kubernetes.io/projected/e01a3024-1558-41e4-bbb4-06451d536782-kube-api-access-dn8r9\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.747584 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq"
Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.747626 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkww4\" (UniqueName: \"kubernetes.io/projected/7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205-kube-api-access-vkww4\") pod \"octavia-operator-controller-manager-659dc6bbfc-65x55\" (UID: \"7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205\") "
pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.747647 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdlgw\" (UniqueName: \"kubernetes.io/projected/61b58ad1-8db7-4a41-9774-38781245baff-kube-api-access-hdlgw\") pod \"nova-operator-controller-manager-567668f5cf-4fsjk\" (UID: \"61b58ad1-8db7-4a41-9774-38781245baff\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.750135 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcbsj\" (UniqueName: \"kubernetes.io/projected/0c8b88b1-8f42-458c-933e-0bcd17da38cb-kube-api-access-kcbsj\") pod \"manila-operator-controller-manager-67d996989d-llvjw\" (UID: \"0c8b88b1-8f42-458c-933e-0bcd17da38cb\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.757639 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct"] Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.760876 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.767223 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-jbzdn" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.767330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmnsh\" (UniqueName: \"kubernetes.io/projected/17dfc012-107f-437d-bbfd-13a1250857ed-kube-api-access-mmnsh\") pod \"mariadb-operator-controller-manager-6994f66f48-df8gr\" (UID: \"17dfc012-107f-437d-bbfd-13a1250857ed\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.776679 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct"] Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.820288 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9"] Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.821565 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.822996 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.839681 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bcbq5" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.855797 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq"] Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.882161 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdlgw\" (UniqueName: \"kubernetes.io/projected/61b58ad1-8db7-4a41-9774-38781245baff-kube-api-access-hdlgw\") pod \"nova-operator-controller-manager-567668f5cf-4fsjk\" (UID: \"61b58ad1-8db7-4a41-9774-38781245baff\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.885976 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwxgw\" (UniqueName: \"kubernetes.io/projected/7cf10a28-d86e-4299-8b06-84888ca3dcb9-kube-api-access-nwxgw\") pod \"ovn-operator-controller-manager-5955d8c787-2qcds\" (UID: \"7cf10a28-d86e-4299-8b06-84888ca3dcb9\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.886079 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn8r9\" (UniqueName: \"kubernetes.io/projected/e01a3024-1558-41e4-bbb4-06451d536782-kube-api-access-dn8r9\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.886172 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8zrq\" (UniqueName: \"kubernetes.io/projected/e5ed9ba2-1499-42b0-9a16-213f7bd6336f-kube-api-access-p8zrq\") pod \"placement-operator-controller-manager-8497b45c89-vbhct\" (UID: \"e5ed9ba2-1499-42b0-9a16-213f7bd6336f\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.886205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.886250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkww4\" (UniqueName: \"kubernetes.io/projected/7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205-kube-api-access-vkww4\") pod \"octavia-operator-controller-manager-659dc6bbfc-65x55\" (UID: \"7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.947117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbx2f\" (UniqueName: 
\"kubernetes.io/projected/20a5fef1-ac14-40c6-bb97-6e6f39be1645-kube-api-access-hbx2f\") pod \"neutron-operator-controller-manager-6bd4687957-6j9rs\" (UID: \"20a5fef1-ac14-40c6-bb97-6e6f39be1645\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.951164 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb"] Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.952032 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.954885 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.960627 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-djkzk" Feb 27 01:21:30 crc kubenswrapper[4771]: E0227 01:21:30.965434 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:30 crc kubenswrapper[4771]: E0227 01:21:30.965506 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert podName:e01a3024-1558-41e4-bbb4-06451d536782 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:31.465486082 +0000 UTC m=+1004.403047370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" (UID: "e01a3024-1558-41e4-bbb4-06451d536782") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.972690 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9"] Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.979640 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb"] Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.991854 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.991928 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8zrq\" (UniqueName: \"kubernetes.io/projected/e5ed9ba2-1499-42b0-9a16-213f7bd6336f-kube-api-access-p8zrq\") pod \"placement-operator-controller-manager-8497b45c89-vbhct\" (UID: \"e5ed9ba2-1499-42b0-9a16-213f7bd6336f\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.993472 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9s4f\" (UniqueName: \"kubernetes.io/projected/aece7f0f-11e5-4934-b818-f8c92e54439b-kube-api-access-l9s4f\") pod \"swift-operator-controller-manager-68f46476f-d8xdb\" (UID: \"aece7f0f-11e5-4934-b818-f8c92e54439b\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.993518 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwgwn\" (UniqueName: \"kubernetes.io/projected/a7c97c14-2dc7-409a-bb85-7e10031e839b-kube-api-access-zwgwn\") pod \"telemetry-operator-controller-manager-589c568786-wb7w9\" (UID: \"a7c97c14-2dc7-409a-bb85-7e10031e839b\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.996603 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkww4\" (UniqueName: \"kubernetes.io/projected/7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205-kube-api-access-vkww4\") pod \"octavia-operator-controller-manager-659dc6bbfc-65x55\" (UID: \"7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" Feb 27 01:21:30 crc kubenswrapper[4771]: I0227 01:21:30.999266 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn8r9\" (UniqueName: \"kubernetes.io/projected/e01a3024-1558-41e4-bbb4-06451d536782-kube-api-access-dn8r9\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.008746 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.009607 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.009907 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwxgw\" (UniqueName: \"kubernetes.io/projected/7cf10a28-d86e-4299-8b06-84888ca3dcb9-kube-api-access-nwxgw\") pod \"ovn-operator-controller-manager-5955d8c787-2qcds\" (UID: \"7cf10a28-d86e-4299-8b06-84888ca3dcb9\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.014432 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2lbx4" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.015869 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.024014 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8zrq\" (UniqueName: \"kubernetes.io/projected/e5ed9ba2-1499-42b0-9a16-213f7bd6336f-kube-api-access-p8zrq\") pod \"placement-operator-controller-manager-8497b45c89-vbhct\" (UID: \"e5ed9ba2-1499-42b0-9a16-213f7bd6336f\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.034191 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.035029 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.036523 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.037772 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-h5nkk" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.045946 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.047362 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.070105 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.094825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.094871 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9s4f\" (UniqueName: \"kubernetes.io/projected/aece7f0f-11e5-4934-b818-f8c92e54439b-kube-api-access-l9s4f\") pod \"swift-operator-controller-manager-68f46476f-d8xdb\" (UID: \"aece7f0f-11e5-4934-b818-f8c92e54439b\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.094898 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwgwn\" (UniqueName: \"kubernetes.io/projected/a7c97c14-2dc7-409a-bb85-7e10031e839b-kube-api-access-zwgwn\") pod \"telemetry-operator-controller-manager-589c568786-wb7w9\" (UID: \"a7c97c14-2dc7-409a-bb85-7e10031e839b\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.094918 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgc28\" (UniqueName: \"kubernetes.io/projected/987278ec-2526-4db5-a442-58b38687805c-kube-api-access-pgc28\") pod \"test-operator-controller-manager-5dc6794d5b-cp5l8\" (UID: \"987278ec-2526-4db5-a442-58b38687805c\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.094954 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqhj2\" (UniqueName: \"kubernetes.io/projected/981a63b0-1a15-42f0-8d4a-0dc24dbd87b1-kube-api-access-wqhj2\") pod \"watcher-operator-controller-manager-bccc79885-lg7vn\" (UID: \"981a63b0-1a15-42f0-8d4a-0dc24dbd87b1\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.095086 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.095128 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert podName:5ea9fc68-1ea7-48fe-b692-f99747dbd694 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:32.095112402 +0000 UTC m=+1005.032673690 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert") pod "infra-operator-controller-manager-79d975b745-4p5fg" (UID: "5ea9fc68-1ea7-48fe-b692-f99747dbd694") : secret "infra-operator-webhook-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.104955 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.105794 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.112878 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.112913 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8kjh8" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.112912 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.114199 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.121689 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.122634 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.124266 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-xfv4v" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.124501 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwgwn\" (UniqueName: \"kubernetes.io/projected/a7c97c14-2dc7-409a-bb85-7e10031e839b-kube-api-access-zwgwn\") pod \"telemetry-operator-controller-manager-589c568786-wb7w9\" (UID: \"a7c97c14-2dc7-409a-bb85-7e10031e839b\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.126982 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9s4f\" (UniqueName: \"kubernetes.io/projected/aece7f0f-11e5-4934-b818-f8c92e54439b-kube-api-access-l9s4f\") pod \"swift-operator-controller-manager-68f46476f-d8xdb\" (UID: \"aece7f0f-11e5-4934-b818-f8c92e54439b\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.131279 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.141940 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.167746 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.195947 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.195977 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.196017 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgc28\" (UniqueName: \"kubernetes.io/projected/987278ec-2526-4db5-a442-58b38687805c-kube-api-access-pgc28\") pod \"test-operator-controller-manager-5dc6794d5b-cp5l8\" (UID: \"987278ec-2526-4db5-a442-58b38687805c\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.196046 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfpn\" (UniqueName: \"kubernetes.io/projected/bef6603d-191e-4d4b-b824-4a8d4f81c991-kube-api-access-hlfpn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2rpsh\" (UID: \"bef6603d-191e-4d4b-b824-4a8d4f81c991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.196068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqhj2\" (UniqueName: \"kubernetes.io/projected/981a63b0-1a15-42f0-8d4a-0dc24dbd87b1-kube-api-access-wqhj2\") pod \"watcher-operator-controller-manager-bccc79885-lg7vn\" (UID: \"981a63b0-1a15-42f0-8d4a-0dc24dbd87b1\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.196101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwlkw\" (UniqueName: \"kubernetes.io/projected/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-kube-api-access-dwlkw\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.219058 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqhj2\" (UniqueName: \"kubernetes.io/projected/981a63b0-1a15-42f0-8d4a-0dc24dbd87b1-kube-api-access-wqhj2\") pod \"watcher-operator-controller-manager-bccc79885-lg7vn\" (UID: \"981a63b0-1a15-42f0-8d4a-0dc24dbd87b1\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.220722 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgc28\" (UniqueName: 
\"kubernetes.io/projected/987278ec-2526-4db5-a442-58b38687805c-kube-api-access-pgc28\") pod \"test-operator-controller-manager-5dc6794d5b-cp5l8\" (UID: \"987278ec-2526-4db5-a442-58b38687805c\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.257421 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.297650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwlkw\" (UniqueName: \"kubernetes.io/projected/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-kube-api-access-dwlkw\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.297739 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.297761 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.297807 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfpn\" (UniqueName: \"kubernetes.io/projected/bef6603d-191e-4d4b-b824-4a8d4f81c991-kube-api-access-hlfpn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2rpsh\" (UID: \"bef6603d-191e-4d4b-b824-4a8d4f81c991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh" Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.298522 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.298584 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:31.7985688 +0000 UTC m=+1004.736130088 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "metrics-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.298742 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.298810 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:31.798789235 +0000 UTC m=+1004.736350523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "webhook-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.315200 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfpn\" (UniqueName: \"kubernetes.io/projected/bef6603d-191e-4d4b-b824-4a8d4f81c991-kube-api-access-hlfpn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2rpsh\" (UID: \"bef6603d-191e-4d4b-b824-4a8d4f81c991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.328593 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwlkw\" (UniqueName: \"kubernetes.io/projected/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-kube-api-access-dwlkw\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.422724 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.476664 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.487644 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"] Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.495894 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.500488 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.501135 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.501448 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.501497 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert podName:e01a3024-1558-41e4-bbb4-06451d536782 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:32.501480821 +0000 UTC m=+1005.439042109 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" (UID: "e01a3024-1558-41e4-bbb4-06451d536782") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.538765 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.803747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: I0227 01:21:31.804042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.803945 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.804234 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:32.80421983 +0000 UTC m=+1005.741781118 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "metrics-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.804182 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 01:21:31 crc kubenswrapper[4771]: E0227 01:21:31.805333 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:32.805300219 +0000 UTC m=+1005.742861497 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "webhook-server-cert" not found Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.007207 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.031974 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.044344 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"] Feb 27 01:21:32 crc kubenswrapper[4771]: W0227 01:21:32.055796 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod646fbcd2_1bd9_4e76_a70b_c4812c6cdbf7.slice/crio-17aab5a54609a57201049daaa9a7948873e483d103a9df5e013ae707b70c5d35 WatchSource:0}: Error finding container 17aab5a54609a57201049daaa9a7948873e483d103a9df5e013ae707b70c5d35: Status 404 returned error can't find the container with id 17aab5a54609a57201049daaa9a7948873e483d103a9df5e013ae707b70c5d35 Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.094107 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.102609 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.109526 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.110312 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.110365 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert podName:5ea9fc68-1ea7-48fe-b692-f99747dbd694 nodeName:}" 
failed. No retries permitted until 2026-02-27 01:21:34.110348941 +0000 UTC m=+1007.047910229 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert") pod "infra-operator-controller-manager-79d975b745-4p5fg" (UID: "5ea9fc68-1ea7-48fe-b692-f99747dbd694") : secret "infra-operator-webhook-server-cert" not found Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.112662 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"] Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.116891 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nwxgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5955d8c787-2qcds_openstack-operators(7cf10a28-d86e-4299-8b06-84888ca3dcb9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.117224 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-74hsx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-55d77d7b5c-9jpm2_openstack-operators(f882b343-7b46-4516-9a17-833858bbfda7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.118783 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2" podUID="f882b343-7b46-4516-9a17-833858bbfda7" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.118831 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" podUID="7cf10a28-d86e-4299-8b06-84888ca3dcb9" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.122145 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-llvjw"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.130359 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.139793 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.145126 4771 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.149928 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.156411 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.269646 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.290988 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55"] Feb 27 01:21:32 crc kubenswrapper[4771]: W0227 01:21:32.292063 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbef6603d_191e_4d4b_b824_4a8d4f81c991.slice/crio-677a15b721f6fbd8f259a348486d6854d4fd4c0fbf00d2359caeeb0eae473720 WatchSource:0}: Error finding container 677a15b721f6fbd8f259a348486d6854d4fd4c0fbf00d2359caeeb0eae473720: Status 404 returned error can't find the container with id 677a15b721f6fbd8f259a348486d6854d4fd4c0fbf00d2359caeeb0eae473720 Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.296329 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vkww4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-659dc6bbfc-65x55_openstack-operators(7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.297661 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" podUID="7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205" Feb 27 01:21:32 crc kubenswrapper[4771]: W0227 01:21:32.302378 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987278ec_2526_4db5_a442_58b38687805c.slice/crio-9e7a12feb7c2841b7ee0ae16938b9ab2a4bf6baf0a0619c670b215c58e092991 WatchSource:0}: Error finding container 9e7a12feb7c2841b7ee0ae16938b9ab2a4bf6baf0a0619c670b215c58e092991: Status 404 returned error can't find the container with id 9e7a12feb7c2841b7ee0ae16938b9ab2a4bf6baf0a0619c670b215c58e092991 Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.303838 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8"] Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.305165 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pgc28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-cp5l8_openstack-operators(987278ec-2526-4db5-a442-58b38687805c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.307957 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" podUID="987278ec-2526-4db5-a442-58b38687805c" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.313835 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9"] Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.326213 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn"] Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.326688 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwgwn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-589c568786-wb7w9_openstack-operators(a7c97c14-2dc7-409a-bb85-7e10031e839b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.328378 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" podUID="a7c97c14-2dc7-409a-bb85-7e10031e839b" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.332966 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs"] Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.339704 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wqhj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-lg7vn_openstack-operators(981a63b0-1a15-42f0-8d4a-0dc24dbd87b1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.340991 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" podUID="981a63b0-1a15-42f0-8d4a-0dc24dbd87b1" Feb 27 01:21:32 crc kubenswrapper[4771]: W0227 01:21:32.342871 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20a5fef1_ac14_40c6_bb97_6e6f39be1645.slice/crio-65bb3b168ff61aa43814c64473e6a923fd515bca206393f2a3a587eac0ef371a WatchSource:0}: Error finding container 65bb3b168ff61aa43814c64473e6a923fd515bca206393f2a3a587eac0ef371a: Status 404 returned error can't find the container with id 65bb3b168ff61aa43814c64473e6a923fd515bca206393f2a3a587eac0ef371a Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.344914 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hbx2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6bd4687957-6j9rs_openstack-operators(20a5fef1-ac14-40c6-bb97-6e6f39be1645): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.346075 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" podUID="20a5fef1-ac14-40c6-bb97-6e6f39be1645" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.489613 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" event={"ID":"aece7f0f-11e5-4934-b818-f8c92e54439b","Type":"ContainerStarted","Data":"07ca538e48f1a720a545be62a73e752590a80133cf5cf44beb3a7e0e1fe34b9f"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.490818 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr" event={"ID":"17dfc012-107f-437d-bbfd-13a1250857ed","Type":"ContainerStarted","Data":"8ac817354848a7362626b3648c7b7b4cf2534d29220d7c817a33c16888f27d84"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.492787 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw" event={"ID":"eb603c5e-cb7c-41e4-ac8a-f9a960141d16","Type":"ContainerStarted","Data":"9cd24446678c16a9e07d55f2ceffb790d686b89327841aab1d6d5779fd44e840"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.494081 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l" event={"ID":"a40b776f-5677-4909-8b04-a5b2318737bc","Type":"ContainerStarted","Data":"18eb4c5fa46aa024b0b8f6d6385dfa727e64e67230b8115ac3111427870d572a"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.495251 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2" event={"ID":"f882b343-7b46-4516-9a17-833858bbfda7","Type":"ContainerStarted","Data":"377dc8f55e9bb145251e813997f413317bd9dd9b7030b727c2206830612c65a0"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.497427 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" event={"ID":"e5ed9ba2-1499-42b0-9a16-213f7bd6336f","Type":"ContainerStarted","Data":"53d8c2882cb7c008ddf5defe7a468fa068d5ed5a3d8d875b362a0fbde5c5968a"} Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.498520 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
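Every "Unhandled Error" above is the same transient failure: the container specs are fine, but each pull was rejected with ErrImagePull: "pull QPS exceeded". The kubelet gates image pulls behind a token bucket sized by registryPullQPS and registryBurst (the upstream defaults are 5 and 10); starting a dozen-plus operator deployments at once drains the bucket and the stragglers are refused, then retried. A minimal sketch of that admission check using the client-go token-bucket primitive the kubelet wraps around its image service; the values and wiring here are illustrative assumptions, not this node's configuration:

    package main

    import (
        "fmt"

        "k8s.io/client-go/util/flowcontrol"
    )

    // Sketch of the kubelet's pull throttle: a token bucket with the assumed
    // defaults registryPullQPS=5, registryBurst=10 (not read from this node).
    func main() {
        limiter := flowcontrol.NewTokenBucketRateLimiter(5, 10)
        for i := 1; i <= 15; i++ {
            if limiter.TryAccept() {
                fmt.Printf("pull %2d: admitted\n", i)
            } else {
                // The kubelet surfaces this branch as ErrImagePull: "pull QPS exceeded".
                fmt.Printf("pull %2d: pull QPS exceeded\n", i)
            }
        }
    }

Raising registryPullQPS in the kubelet configuration (or setting it to 0 to disable the limit) trades registry load for fewer of these transient failures; in this log the pulls simply succeed on a later back-off retry.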
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2" podUID="f882b343-7b46-4516-9a17-833858bbfda7" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.499229 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" event={"ID":"7cf10a28-d86e-4299-8b06-84888ca3dcb9","Type":"ContainerStarted","Data":"f642738565d84678b5113f37ebb464c41bf970b8a70a29bf46c0f6a18945891a"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.500848 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" event={"ID":"7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205","Type":"ContainerStarted","Data":"8cfe46e315a8af1bc5ac0c51e543a8ea04155b07b841779d815e601d2c589f73"} Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.502086 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" podUID="7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.502190 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" podUID="7cf10a28-d86e-4299-8b06-84888ca3dcb9" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.502633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" event={"ID":"61b58ad1-8db7-4a41-9774-38781245baff","Type":"ContainerStarted","Data":"d995c57c26ca535ea2c64b1c55dff7f7875d8de2ee150e3cbec7543a0fad7365"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.513189 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth" event={"ID":"8bd8d6ef-0025-4148-a530-1964ae763645","Type":"ContainerStarted","Data":"9fac190dfffdce86920e69b4fb34ee2672240a1ff7d7038a4bf7ffe41b3c7580"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.515230 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" event={"ID":"a7c97c14-2dc7-409a-bb85-7e10031e839b","Type":"ContainerStarted","Data":"60ad02cb50e57130b340eca1649223692db1a630103c161c9277a6007182a8e7"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.517190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.517410 4771 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" podUID="a7c97c14-2dc7-409a-bb85-7e10031e839b" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.517539 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.517704 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert podName:e01a3024-1558-41e4-bbb4-06451d536782 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:34.517682086 +0000 UTC m=+1007.455243384 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" (UID: "e01a3024-1558-41e4-bbb4-06451d536782") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.524055 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr" event={"ID":"f77508f2-411f-4644-9b48-7edbefaf3bb4","Type":"ContainerStarted","Data":"cd8207a7b44306c36a81d5372a6165357cc354c210bfde19acbb5b66a94423b8"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.529284 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx" event={"ID":"9f4615e8-ebc8-43ff-bdec-481f86af58bf","Type":"ContainerStarted","Data":"0e7c6712cfd09164c5da89a13da3678589f481b8a229f0762bbc8d8d9f407932"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.530925 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw" event={"ID":"0c8b88b1-8f42-458c-933e-0bcd17da38cb","Type":"ContainerStarted","Data":"53fd6421d2549908e99d5252af1d1eed49bda1f0d0d6c19064efe61268dd3326"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.533450 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" event={"ID":"987278ec-2526-4db5-a442-58b38687805c","Type":"ContainerStarted","Data":"9e7a12feb7c2841b7ee0ae16938b9ab2a4bf6baf0a0619c670b215c58e092991"} Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.534760 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" podUID="987278ec-2526-4db5-a442-58b38687805c" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.535887 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj" event={"ID":"646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7","Type":"ContainerStarted","Data":"17aab5a54609a57201049daaa9a7948873e483d103a9df5e013ae707b70c5d35"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 
01:21:32.541442 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh" event={"ID":"bef6603d-191e-4d4b-b824-4a8d4f81c991","Type":"ContainerStarted","Data":"677a15b721f6fbd8f259a348486d6854d4fd4c0fbf00d2359caeeb0eae473720"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.552769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" event={"ID":"20a5fef1-ac14-40c6-bb97-6e6f39be1645","Type":"ContainerStarted","Data":"65bb3b168ff61aa43814c64473e6a923fd515bca206393f2a3a587eac0ef371a"} Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.554577 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" podUID="20a5fef1-ac14-40c6-bb97-6e6f39be1645" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.555981 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv" event={"ID":"b563eec9-7160-44db-a640-4cf7e25bc893","Type":"ContainerStarted","Data":"c283fc6d523c5464a33ca944b25b2d0d50b83bc7c49ddaf968c4795f950dc072"} Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.557093 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" event={"ID":"981a63b0-1a15-42f0-8d4a-0dc24dbd87b1","Type":"ContainerStarted","Data":"852c3d9e4a56967893cdec0c075d291edcc86be1be7b8801a4f5a8051449aeb3"} Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.558030 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" podUID="981a63b0-1a15-42f0-8d4a-0dc24dbd87b1" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.822161 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:32 crc kubenswrapper[4771]: I0227 01:21:32.822200 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.823009 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.823088 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:34.823070837 +0000 UTC m=+1007.760632125 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "metrics-server-cert" not found Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.823668 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 01:21:32 crc kubenswrapper[4771]: E0227 01:21:32.823730 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:34.823713655 +0000 UTC m=+1007.761274943 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "webhook-server-cert" not found Feb 27 01:21:33 crc kubenswrapper[4771]: E0227 01:21:33.571620 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" podUID="7cf10a28-d86e-4299-8b06-84888ca3dcb9" Feb 27 01:21:33 crc kubenswrapper[4771]: E0227 01:21:33.571623 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" podUID="981a63b0-1a15-42f0-8d4a-0dc24dbd87b1" Feb 27 01:21:33 crc kubenswrapper[4771]: E0227 01:21:33.571708 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" podUID="20a5fef1-ac14-40c6-bb97-6e6f39be1645" Feb 27 01:21:33 crc kubenswrapper[4771]: E0227 01:21:33.571842 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" podUID="a7c97c14-2dc7-409a-bb85-7e10031e839b" Feb 27 01:21:33 crc kubenswrapper[4771]: E0227 01:21:33.571889 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2" podUID="f882b343-7b46-4516-9a17-833858bbfda7" Feb 27 01:21:33 crc kubenswrapper[4771]: E0227 01:21:33.571939 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" podUID="7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205" Feb 27 01:21:33 crc kubenswrapper[4771]: E0227 01:21:33.574942 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" podUID="987278ec-2526-4db5-a442-58b38687805c" Feb 27 01:21:34 crc kubenswrapper[4771]: I0227 01:21:34.148309 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" Feb 27 01:21:34 crc kubenswrapper[4771]: E0227 01:21:34.148841 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 01:21:34 crc kubenswrapper[4771]: E0227 01:21:34.148886 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert podName:5ea9fc68-1ea7-48fe-b692-f99747dbd694 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:38.148873169 +0000 UTC m=+1011.086434457 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert") pod "infra-operator-controller-manager-79d975b745-4p5fg" (UID: "5ea9fc68-1ea7-48fe-b692-f99747dbd694") : secret "infra-operator-webhook-server-cert" not found Feb 27 01:21:34 crc kubenswrapper[4771]: I0227 01:21:34.553960 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:34 crc kubenswrapper[4771]: E0227 01:21:34.554145 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:34 crc kubenswrapper[4771]: E0227 01:21:34.554244 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert podName:e01a3024-1558-41e4-bbb4-06451d536782 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:38.55422038 +0000 UTC m=+1011.491781688 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" (UID: "e01a3024-1558-41e4-bbb4-06451d536782") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:34 crc kubenswrapper[4771]: I0227 01:21:34.858561 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:34 crc kubenswrapper[4771]: I0227 01:21:34.858611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:34 crc kubenswrapper[4771]: E0227 01:21:34.858847 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 01:21:34 crc kubenswrapper[4771]: E0227 01:21:34.858910 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:38.858891161 +0000 UTC m=+1011.796452449 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "webhook-server-cert" not found Feb 27 01:21:34 crc kubenswrapper[4771]: E0227 01:21:34.858968 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 01:21:34 crc kubenswrapper[4771]: E0227 01:21:34.858995 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:38.858986223 +0000 UTC m=+1011.796547511 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "metrics-server-cert" not found Feb 27 01:21:38 crc kubenswrapper[4771]: I0227 01:21:38.218715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert\") pod \"infra-operator-controller-manager-79d975b745-4p5fg\" (UID: \"5ea9fc68-1ea7-48fe-b692-f99747dbd694\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" Feb 27 01:21:38 crc kubenswrapper[4771]: E0227 01:21:38.218915 4771 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 27 01:21:38 crc kubenswrapper[4771]: E0227 01:21:38.219242 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert podName:5ea9fc68-1ea7-48fe-b692-f99747dbd694 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:46.219218561 +0000 UTC m=+1019.156779849 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea9fc68-1ea7-48fe-b692-f99747dbd694-cert") pod "infra-operator-controller-manager-79d975b745-4p5fg" (UID: "5ea9fc68-1ea7-48fe-b692-f99747dbd694") : secret "infra-operator-webhook-server-cert" not found Feb 27 01:21:38 crc kubenswrapper[4771]: I0227 01:21:38.624125 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:38 crc kubenswrapper[4771]: E0227 01:21:38.624290 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:38 crc kubenswrapper[4771]: E0227 01:21:38.624349 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert podName:e01a3024-1558-41e4-bbb4-06451d536782 nodeName:}" failed. No retries permitted until 2026-02-27 01:21:46.624327776 +0000 UTC m=+1019.561889074 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" (UID: "e01a3024-1558-41e4-bbb4-06451d536782") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 27 01:21:38 crc kubenswrapper[4771]: I0227 01:21:38.927114 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:38 crc kubenswrapper[4771]: I0227 01:21:38.927160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:38 crc kubenswrapper[4771]: E0227 01:21:38.927392 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 27 01:21:38 crc kubenswrapper[4771]: E0227 01:21:38.927451 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:46.927429894 +0000 UTC m=+1019.864991182 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "webhook-server-cert" not found Feb 27 01:21:38 crc kubenswrapper[4771]: E0227 01:21:38.927795 4771 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 27 01:21:38 crc kubenswrapper[4771]: E0227 01:21:38.927830 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs podName:b4a70780-ab41-4199-b1b8-09b01cd6a4ac nodeName:}" failed. No retries permitted until 2026-02-27 01:21:46.927820734 +0000 UTC m=+1019.865382022 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs") pod "openstack-operator-controller-manager-5dc6fb848b-7nk64" (UID: "b4a70780-ab41-4199-b1b8-09b01cd6a4ac") : secret "metrics-server-cert" not found Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.680108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" event={"ID":"61b58ad1-8db7-4a41-9774-38781245baff","Type":"ContainerStarted","Data":"9d19dda155b0ecf95fc9c4f5375dafbbb5b4ba35ab3990a817b629d2326038a2"} Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.680693 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.685008 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx" event={"ID":"9f4615e8-ebc8-43ff-bdec-481f86af58bf","Type":"ContainerStarted","Data":"449300b03e7f2ece740b013861c5d37d141e084a6a32aa959c28e15c6697d720"} Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.685141 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx" Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.695041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj" event={"ID":"646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7","Type":"ContainerStarted","Data":"8874cbf6eab0f4a2132461742460921b00d17d696f6019efb185fe48fd6ee23e"} Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.695190 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj" Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.707075 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" podStartSLOduration=3.271780022 podStartE2EDuration="15.707055074s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.081910324 +0000 UTC m=+1005.019471612" lastFinishedPulling="2026-02-27 01:21:44.517185376 +0000 UTC m=+1017.454746664" observedRunningTime="2026-02-27 01:21:45.704897066 +0000 UTC m=+1018.642458354" watchObservedRunningTime="2026-02-27 01:21:45.707055074 +0000 UTC m=+1018.644616362" Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.708190 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l" event={"ID":"a40b776f-5677-4909-8b04-a5b2318737bc","Type":"ContainerStarted","Data":"53185ddff7e3cf344f4925a6727aebe38f27df27734dc070606abc7e9e624b35"} Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.708330 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l" Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.721027 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw" event={"ID":"0c8b88b1-8f42-458c-933e-0bcd17da38cb","Type":"ContainerStarted","Data":"d2743f03da26d46dc03628275f73326bb71811db55e5b0d162e4c8ec41c4c653"} Feb 27 01:21:45 crc 
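Note the doubling durationBeforeRetry across these attempts for the same three volumes: 2s at 01:21:32, 4s at 01:21:34, 8s at 01:21:38. That is the volume manager's per-operation exponential backoff (the nestedpendingoperations bookkeeping emitting the errors above). A minimal sketch of the same doubling policy using apimachinery's wait package, offered as an analogy rather than the kubelet's actual implementation:

    package main

    import (
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    // Sketch of the doubling retry seen above (2s -> 4s -> 8s). The kubelet
    // keeps this state per operation in nestedpendingoperations;
    // wait.ExponentialBackoff is an analogy, not that code.
    func main() {
        backoff := wait.Backoff{
            Duration: 2 * time.Second, // first durationBeforeRetry
            Factor:   2.0,             // double the wait each time
            Steps:    4,               // bounded here for the sketch only
        }
        attempt := 0
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            attempt++
            fmt.Printf("attempt %d at %s\n", attempt, time.Now().Format("15:04:05"))
            return secretExists(), nil // done once the Secret finally shows up
        })
        if err != nil {
            fmt.Println("still absent after all retries:", err)
        }
    }

    // Stub standing in for an API lookup of the missing Secret.
    func secretExists() bool { return false }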
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.680108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" event={"ID":"61b58ad1-8db7-4a41-9774-38781245baff","Type":"ContainerStarted","Data":"9d19dda155b0ecf95fc9c4f5375dafbbb5b4ba35ab3990a817b629d2326038a2"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.680693 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.685008 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx" event={"ID":"9f4615e8-ebc8-43ff-bdec-481f86af58bf","Type":"ContainerStarted","Data":"449300b03e7f2ece740b013861c5d37d141e084a6a32aa959c28e15c6697d720"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.685141 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.695041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj" event={"ID":"646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7","Type":"ContainerStarted","Data":"8874cbf6eab0f4a2132461742460921b00d17d696f6019efb185fe48fd6ee23e"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.695190 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.707075 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" podStartSLOduration=3.271780022 podStartE2EDuration="15.707055074s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.081910324 +0000 UTC m=+1005.019471612" lastFinishedPulling="2026-02-27 01:21:44.517185376 +0000 UTC m=+1017.454746664" observedRunningTime="2026-02-27 01:21:45.704897066 +0000 UTC m=+1018.642458354" watchObservedRunningTime="2026-02-27 01:21:45.707055074 +0000 UTC m=+1018.644616362"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.708190 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l" event={"ID":"a40b776f-5677-4909-8b04-a5b2318737bc","Type":"ContainerStarted","Data":"53185ddff7e3cf344f4925a6727aebe38f27df27734dc070606abc7e9e624b35"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.708330 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.721027 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw" event={"ID":"0c8b88b1-8f42-458c-933e-0bcd17da38cb","Type":"ContainerStarted","Data":"d2743f03da26d46dc03628275f73326bb71811db55e5b0d162e4c8ec41c4c653"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.721140 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.729648 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" event={"ID":"e5ed9ba2-1499-42b0-9a16-213f7bd6336f","Type":"ContainerStarted","Data":"8b9d6a03e217f0ed8c88b534ac275ef2399db4655bb4d8e520e00306151c3d22"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.729803 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.742108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr" event={"ID":"f77508f2-411f-4644-9b48-7edbefaf3bb4","Type":"ContainerStarted","Data":"7f4b60e33ecd932bc4781a901da7a521b00e23c1224b4e796600a5bb37c7c189"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.742206 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.748857 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj" podStartSLOduration=3.363407955 podStartE2EDuration="15.748833806s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.0810394 +0000 UTC m=+1005.018600688" lastFinishedPulling="2026-02-27 01:21:44.466465231 +0000 UTC m=+1017.404026539" observedRunningTime="2026-02-27 01:21:45.741942087 +0000 UTC m=+1018.679503375" watchObservedRunningTime="2026-02-27 01:21:45.748833806 +0000 UTC m=+1018.686395094"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.753262 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw" event={"ID":"eb603c5e-cb7c-41e4-ac8a-f9a960141d16","Type":"ContainerStarted","Data":"87e1bc9d01abc0d7b6a76d5df75cef4a1792f43a5fc3efa1828fe77f69eb0e6f"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.753405 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.764131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh" event={"ID":"bef6603d-191e-4d4b-b824-4a8d4f81c991","Type":"ContainerStarted","Data":"8cea309db9828205b355a9ad7894a113f8ecdbef834fedf681420ef33a21b4ce"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.795614 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth" event={"ID":"8bd8d6ef-0025-4148-a530-1964ae763645","Type":"ContainerStarted","Data":"0c2917b2df2dcc821c9bba3de80008901c0c5f842c377607e5ea854a6db104cd"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.795656 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" event={"ID":"aece7f0f-11e5-4934-b818-f8c92e54439b","Type":"ContainerStarted","Data":"fa44e8c3aad0252cbd246890c32f04413a4e2d64cb1531453b3723917964266a"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.795673 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.795684 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.804899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr" event={"ID":"17dfc012-107f-437d-bbfd-13a1250857ed","Type":"ContainerStarted","Data":"6b6ab5df16794afca02ea49879b0f44b12b64f9a7a8f909566daefdf831abff0"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.804950 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.822792 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth" podStartSLOduration=3.472036822 podStartE2EDuration="15.822776785s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.096789951 +0000 UTC m=+1005.034351239" lastFinishedPulling="2026-02-27 01:21:44.447529914 +0000 UTC m=+1017.385091202" observedRunningTime="2026-02-27 01:21:45.821913461 +0000 UTC m=+1018.759474739" watchObservedRunningTime="2026-02-27 01:21:45.822776785 +0000 UTC m=+1018.760338073"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.823266 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx" podStartSLOduration=4.036787837 podStartE2EDuration="15.823260548s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:31.580747486 +0000 UTC m=+1004.518308774" lastFinishedPulling="2026-02-27 01:21:43.367220197 +0000 UTC m=+1016.304781485" observedRunningTime="2026-02-27 01:21:45.796324493 +0000 UTC m=+1018.733885771" watchObservedRunningTime="2026-02-27 01:21:45.823260548 +0000 UTC m=+1018.760821836"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.832351 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv" event={"ID":"b563eec9-7160-44db-a640-4cf7e25bc893","Type":"ContainerStarted","Data":"3d8ee9726051119e2d5020a2999f5a6dac31c28a6ff2b6922744b09bca130978"}
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.832982 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.855290 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2rpsh" podStartSLOduration=2.634021748 podStartE2EDuration="14.855266703s" podCreationTimestamp="2026-02-27 01:21:31 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.295993162 +0000 UTC m=+1005.233554440" lastFinishedPulling="2026-02-27 01:21:44.517238097 +0000 UTC m=+1017.454799395" observedRunningTime="2026-02-27 01:21:45.851985343 +0000 UTC m=+1018.789546631" watchObservedRunningTime="2026-02-27 01:21:45.855266703 +0000 UTC m=+1018.792827991"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.888779 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" podStartSLOduration=3.521345319 podStartE2EDuration="15.888758377s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.080889226 +0000 UTC m=+1005.018450514" lastFinishedPulling="2026-02-27 01:21:44.448302284 +0000 UTC m=+1017.385863572" observedRunningTime="2026-02-27 01:21:45.88192045 +0000 UTC m=+1018.819481738" watchObservedRunningTime="2026-02-27 01:21:45.888758377 +0000 UTC m=+1018.826319665"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.917154 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l" podStartSLOduration=4.240661315 podStartE2EDuration="15.917123962s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.098010234 +0000 UTC m=+1005.035571522" lastFinishedPulling="2026-02-27 01:21:43.774472881 +0000 UTC m=+1016.712034169" observedRunningTime="2026-02-27 01:21:45.914392397 +0000 UTC m=+1018.851953685" watchObservedRunningTime="2026-02-27 01:21:45.917123962 +0000 UTC m=+1018.854685250"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.947645 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr" podStartSLOduration=3.761648853 podStartE2EDuration="15.947618465s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:31.588737675 +0000 UTC m=+1004.526298963" lastFinishedPulling="2026-02-27 01:21:43.774707287 +0000 UTC m=+1016.712268575" observedRunningTime="2026-02-27 01:21:45.945299641 +0000 UTC m=+1018.882860939" watchObservedRunningTime="2026-02-27 01:21:45.947618465 +0000 UTC m=+1018.885179763"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.976055 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw" podStartSLOduration=3.617279829 podStartE2EDuration="15.976025931s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.104650435 +0000 UTC m=+1005.042211723" lastFinishedPulling="2026-02-27 01:21:44.463396517 +0000 UTC m=+1017.400957825" observedRunningTime="2026-02-27 01:21:45.966900672 +0000 UTC m=+1018.904461960" watchObservedRunningTime="2026-02-27 01:21:45.976025931 +0000 UTC m=+1018.913587209"
Feb 27 01:21:45 crc kubenswrapper[4771]: I0227 01:21:45.995238 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw" podStartSLOduration=3.419705982 podStartE2EDuration="15.995223155s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.016347673 +0000 UTC m=+1004.953908961" lastFinishedPulling="2026-02-27 01:21:44.591864806 +0000 UTC m=+1017.529426134" observedRunningTime="2026-02-27 01:21:45.992998234 +0000 UTC m=+1018.930559522" watchObservedRunningTime="2026-02-27 01:21:45.995223155 +0000 UTC m=+1018.932784443"
Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.009138 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" podStartSLOduration=4.747498949 podStartE2EDuration="16.009112755s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.105508009 +0000 UTC m=+1005.043069297" lastFinishedPulling="2026-02-27 01:21:43.367121815 +0000 UTC m=+1016.304683103" observedRunningTime="2026-02-27 01:21:46.003887012 +0000 UTC m=+1018.941448540" watchObservedRunningTime="2026-02-27 01:21:46.009112755 +0000 UTC m=+1018.946674043"
Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.019525 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv" podStartSLOduration=3.639027752 podStartE2EDuration="16.019507368s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.080968208 +0000 UTC m=+1005.018529506" lastFinishedPulling="2026-02-27 01:21:44.461447794 +0000 UTC m=+1017.399009122" observedRunningTime="2026-02-27 01:21:46.018064959 +0000 UTC m=+1018.955626257" watchObservedRunningTime="2026-02-27 01:21:46.019507368 +0000 UTC m=+1018.957068656"
Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.047583 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr" podStartSLOduration=3.6975845019999998 podStartE2EDuration="16.047566154s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.098343263 +0000 UTC m=+1005.035904541" lastFinishedPulling="2026-02-27 01:21:44.448324885 +0000 UTC m=+1017.385886193" observedRunningTime="2026-02-27 01:21:46.043889435 +0000 UTC m=+1018.981450723" watchObservedRunningTime="2026-02-27 01:21:46.047566154 +0000 UTC m=+1018.985127442"
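The pod_startup_latency_tracker lines decompose startup time: podStartE2EDuration is observed-running minus pod creation, and podStartSLOduration additionally excludes the time spent pulling images (lastFinishedPulling minus firstStartedPulling). The nova-operator entry above checks out: 15.707055074 - (1017.454746664 - 1005.019471612) = 3.271780022. A tiny sketch of that arithmetic using the monotonic m=+ offsets from the log:

    package main

    import "fmt"

    // Verify the nova-operator startup numbers from the log: the SLO duration
    // is the end-to-end startup time minus the image-pull window, all in
    // seconds taken from the monotonic m=+ offsets above.
    func main() {
        const (
            e2e       = 15.707055074   // podStartE2EDuration
            firstPull = 1005.019471612 // firstStartedPulling, m=+ offset
            lastPull  = 1017.454746664 // lastFinishedPulling, m=+ offset
        )
        fmt.Printf("podStartSLOduration ~ %.9f s\n", e2e-(lastPull-firstPull)) // 3.271780022
    }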
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.667215 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.672206 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e01a3024-1558-41e4-bbb4-06451d536782-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq\" (UID: \"e01a3024-1558-41e4-bbb4-06451d536782\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.693052 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.973654 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.974306 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.980234 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-webhook-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:46 crc kubenswrapper[4771]: I0227 01:21:46.984982 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4a70780-ab41-4199-b1b8-09b01cd6a4ac-metrics-certs\") pod \"openstack-operator-controller-manager-5dc6fb848b-7nk64\" (UID: \"b4a70780-ab41-4199-b1b8-09b01cd6a4ac\") " pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:47 crc kubenswrapper[4771]: I0227 01:21:47.130853 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:47 crc kubenswrapper[4771]: I0227 01:21:47.158101 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg"] Feb 27 01:21:47 crc kubenswrapper[4771]: I0227 01:21:47.163282 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq"] Feb 27 01:21:47 crc kubenswrapper[4771]: W0227 01:21:47.598567 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode01a3024_1558_41e4_bbb4_06451d536782.slice/crio-eeaf24db5538e2394fae9138710e39cc37e91cac3b61e45826008c3a31e2b9d8 WatchSource:0}: Error finding container eeaf24db5538e2394fae9138710e39cc37e91cac3b61e45826008c3a31e2b9d8: Status 404 returned error can't find the container with id eeaf24db5538e2394fae9138710e39cc37e91cac3b61e45826008c3a31e2b9d8 Feb 27 01:21:47 crc kubenswrapper[4771]: W0227 01:21:47.600197 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea9fc68_1ea7_48fe_b692_f99747dbd694.slice/crio-c4bb1cd64897d43e17f531c8839d1523d902f7d1ae8da09bfbb3a9d1dba5a561 WatchSource:0}: Error finding container c4bb1cd64897d43e17f531c8839d1523d902f7d1ae8da09bfbb3a9d1dba5a561: Status 404 returned error can't find the container with id c4bb1cd64897d43e17f531c8839d1523d902f7d1ae8da09bfbb3a9d1dba5a561 Feb 27 01:21:47 crc kubenswrapper[4771]: I0227 01:21:47.846368 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" event={"ID":"5ea9fc68-1ea7-48fe-b692-f99747dbd694","Type":"ContainerStarted","Data":"c4bb1cd64897d43e17f531c8839d1523d902f7d1ae8da09bfbb3a9d1dba5a561"} Feb 27 01:21:47 crc kubenswrapper[4771]: I0227 01:21:47.847634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" event={"ID":"e01a3024-1558-41e4-bbb4-06451d536782","Type":"ContainerStarted","Data":"eeaf24db5538e2394fae9138710e39cc37e91cac3b61e45826008c3a31e2b9d8"} Feb 27 01:21:48 crc kubenswrapper[4771]: I0227 01:21:48.781943 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64"] Feb 27 01:21:48 crc kubenswrapper[4771]: I0227 01:21:48.858599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" event={"ID":"7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205","Type":"ContainerStarted","Data":"e3ca745595eaf175e2c3bcd5ccada22d33ada703cd51d6a5469aa3ce6d341216"} Feb 27 01:21:48 crc kubenswrapper[4771]: I0227 01:21:48.859161 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" Feb 27 01:21:48 crc kubenswrapper[4771]: I0227 01:21:48.864845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" event={"ID":"b4a70780-ab41-4199-b1b8-09b01cd6a4ac","Type":"ContainerStarted","Data":"7fa1fcae3404403ed215da025f2e1d01b3654ffddbe71b0f1455738be4f07560"} Feb 27 01:21:48 crc kubenswrapper[4771]: I0227 01:21:48.869314 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" event={"ID":"981a63b0-1a15-42f0-8d4a-0dc24dbd87b1","Type":"ContainerStarted","Data":"9df06c1fcd6f50ce0f295076e86cc1930c658233a0d2598403636cffa5c785ee"} Feb 27 01:21:48 crc kubenswrapper[4771]: I0227 01:21:48.869586 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" Feb 27 01:21:48 crc kubenswrapper[4771]: I0227 01:21:48.888717 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" podStartSLOduration=2.816529059 podStartE2EDuration="18.888700594s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.296235788 +0000 UTC m=+1005.233797076" lastFinishedPulling="2026-02-27 01:21:48.368407313 +0000 UTC m=+1021.305968611" observedRunningTime="2026-02-27 01:21:48.885714242 +0000 UTC m=+1021.823275530" watchObservedRunningTime="2026-02-27 01:21:48.888700594 +0000 UTC m=+1021.826261882" Feb 27 01:21:49 crc kubenswrapper[4771]: I0227 01:21:49.875484 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" event={"ID":"b4a70780-ab41-4199-b1b8-09b01cd6a4ac","Type":"ContainerStarted","Data":"d36d6669752ee4f44fdb5efc7a1735d11039f1854b09d9d41edfc7f98089e9e6"} Feb 27 01:21:49 crc kubenswrapper[4771]: I0227 01:21:49.876058 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:21:49 crc kubenswrapper[4771]: I0227 01:21:49.898962 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" podStartSLOduration=3.869737885 podStartE2EDuration="19.898924266s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.339570182 +0000 UTC m=+1005.277131470" lastFinishedPulling="2026-02-27 01:21:48.368756523 +0000 UTC m=+1021.306317851" observedRunningTime="2026-02-27 01:21:48.903895309 +0000 UTC m=+1021.841456597" watchObservedRunningTime="2026-02-27 01:21:49.898924266 +0000 UTC m=+1022.836485554" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.568659 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-snqrx" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.590139 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" podStartSLOduration=19.590119095 podStartE2EDuration="19.590119095s" podCreationTimestamp="2026-02-27 01:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:21:49.899849412 +0000 UTC m=+1022.837410700" watchObservedRunningTime="2026-02-27 01:21:50.590119095 +0000 UTC m=+1023.527680383" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.603258 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-zlggr" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.654654 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-x969l" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.690323 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8rvj" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.713344 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-w5hxv" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.758372 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t65sw" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.828983 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-mrvth" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.968324 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-df8gr" Feb 27 01:21:50 crc kubenswrapper[4771]: I0227 01:21:50.998096 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-4fsjk" Feb 27 01:21:51 crc kubenswrapper[4771]: I0227 01:21:51.050112 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-llvjw" Feb 27 01:21:51 crc kubenswrapper[4771]: I0227 01:21:51.145088 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-vbhct" Feb 27 01:21:51 crc kubenswrapper[4771]: I0227 01:21:51.175841 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-d8xdb" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.916918 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" event={"ID":"a7c97c14-2dc7-409a-bb85-7e10031e839b","Type":"ContainerStarted","Data":"a4e7f60326a0c58b843a96d3a03852d4e7ec9bae78b97c3e764a4e3dba02d539"} Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.917419 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.918150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" event={"ID":"7cf10a28-d86e-4299-8b06-84888ca3dcb9","Type":"ContainerStarted","Data":"e0f7888b54f7bc152a70042c584cd7fde48dd69ecf3afcff611bbeebc3ed8577"} Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.918308 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.919339 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" event={"ID":"20a5fef1-ac14-40c6-bb97-6e6f39be1645","Type":"ContainerStarted","Data":"81f0a1b51189119b95df7c781e16323962ceefa02e43444022adc5cb5dedb913"} Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.919519 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.920458 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" event={"ID":"987278ec-2526-4db5-a442-58b38687805c","Type":"ContainerStarted","Data":"4060adf87eab2879a3ea185083c7e64412d28420cdee14670eb6e590afc6e697"} Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.920582 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.921933 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" event={"ID":"5ea9fc68-1ea7-48fe-b692-f99747dbd694","Type":"ContainerStarted","Data":"99f4d3a83670f925d5454b033da06bc9673b02dee4c93ca135d5a939a8c32f1d"} Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.922030 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.923348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" event={"ID":"e01a3024-1558-41e4-bbb4-06451d536782","Type":"ContainerStarted","Data":"75cb3d61ace3f45fa887e172511cf81dd64ac0cd5da773b9d9b44df1a20d4412"} Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.923394 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.925139 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2" event={"ID":"f882b343-7b46-4516-9a17-833858bbfda7","Type":"ContainerStarted","Data":"7db7a9e7d2cba41ba649e47939e2c50cc14d291a6a2004f6581d5d0a7686aa0b"} Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.925518 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.937626 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" podStartSLOduration=2.783043845 podStartE2EDuration="25.937612029s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.326526026 +0000 UTC m=+1005.264087314" lastFinishedPulling="2026-02-27 01:21:55.48109421 +0000 UTC m=+1028.418655498" observedRunningTime="2026-02-27 01:21:55.933157728 +0000 UTC m=+1028.870719016" watchObservedRunningTime="2026-02-27 01:21:55.937612029 +0000 UTC m=+1028.875173317" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.979920 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" podStartSLOduration=18.115024282 podStartE2EDuration="25.979908284s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:47.602006341 +0000 UTC m=+1020.539567629" lastFinishedPulling="2026-02-27 01:21:55.466890343 +0000 UTC 
m=+1028.404451631" observedRunningTime="2026-02-27 01:21:55.976370088 +0000 UTC m=+1028.913931376" watchObservedRunningTime="2026-02-27 01:21:55.979908284 +0000 UTC m=+1028.917469572" Feb 27 01:21:55 crc kubenswrapper[4771]: I0227 01:21:55.998313 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" podStartSLOduration=18.136055156 podStartE2EDuration="25.998299796s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:47.602977867 +0000 UTC m=+1020.540539155" lastFinishedPulling="2026-02-27 01:21:55.465222507 +0000 UTC m=+1028.402783795" observedRunningTime="2026-02-27 01:21:55.993420764 +0000 UTC m=+1028.930982042" watchObservedRunningTime="2026-02-27 01:21:55.998299796 +0000 UTC m=+1028.935861084" Feb 27 01:21:56 crc kubenswrapper[4771]: I0227 01:21:56.012797 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" podStartSLOduration=2.682258022 podStartE2EDuration="26.012783413s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.116674414 +0000 UTC m=+1005.054235702" lastFinishedPulling="2026-02-27 01:21:55.447199805 +0000 UTC m=+1028.384761093" observedRunningTime="2026-02-27 01:21:56.010150741 +0000 UTC m=+1028.947712029" watchObservedRunningTime="2026-02-27 01:21:56.012783413 +0000 UTC m=+1028.950344701" Feb 27 01:21:56 crc kubenswrapper[4771]: I0227 01:21:56.022392 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" podStartSLOduration=2.848964455 podStartE2EDuration="26.022374085s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.305034478 +0000 UTC m=+1005.242595776" lastFinishedPulling="2026-02-27 01:21:55.478444038 +0000 UTC m=+1028.416005406" observedRunningTime="2026-02-27 01:21:56.020857402 +0000 UTC m=+1028.958418690" watchObservedRunningTime="2026-02-27 01:21:56.022374085 +0000 UTC m=+1028.959935373" Feb 27 01:21:56 crc kubenswrapper[4771]: I0227 01:21:56.039905 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" podStartSLOduration=2.891886338 podStartE2EDuration="26.039888903s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.344825345 +0000 UTC m=+1005.282386633" lastFinishedPulling="2026-02-27 01:21:55.49282791 +0000 UTC m=+1028.430389198" observedRunningTime="2026-02-27 01:21:56.039184274 +0000 UTC m=+1028.976745562" watchObservedRunningTime="2026-02-27 01:21:56.039888903 +0000 UTC m=+1028.977450191" Feb 27 01:21:56 crc kubenswrapper[4771]: I0227 01:21:56.063467 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2" podStartSLOduration=2.739190646 podStartE2EDuration="26.063451676s" podCreationTimestamp="2026-02-27 01:21:30 +0000 UTC" firstStartedPulling="2026-02-27 01:21:32.117133496 +0000 UTC m=+1005.054694784" lastFinishedPulling="2026-02-27 01:21:55.441394526 +0000 UTC m=+1028.378955814" observedRunningTime="2026-02-27 01:21:56.059746515 +0000 UTC m=+1028.997307803" watchObservedRunningTime="2026-02-27 01:21:56.063451676 +0000 UTC m=+1029.001012964" Feb 27 01:21:57 crc kubenswrapper[4771]: I0227 01:21:57.139305 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5dc6fb848b-7nk64" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.147322 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535922-bgv8w"] Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.149136 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-bgv8w" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.151997 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.152055 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.153063 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.155587 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-bgv8w"] Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.327302 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9cmb\" (UniqueName: \"kubernetes.io/projected/630dd7a9-ea0e-4a92-aedc-8f737ea48316-kube-api-access-k9cmb\") pod \"auto-csr-approver-29535922-bgv8w\" (UID: \"630dd7a9-ea0e-4a92-aedc-8f737ea48316\") " pod="openshift-infra/auto-csr-approver-29535922-bgv8w" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.428608 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9cmb\" (UniqueName: \"kubernetes.io/projected/630dd7a9-ea0e-4a92-aedc-8f737ea48316-kube-api-access-k9cmb\") pod \"auto-csr-approver-29535922-bgv8w\" (UID: \"630dd7a9-ea0e-4a92-aedc-8f737ea48316\") " pod="openshift-infra/auto-csr-approver-29535922-bgv8w" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.459927 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9cmb\" (UniqueName: \"kubernetes.io/projected/630dd7a9-ea0e-4a92-aedc-8f737ea48316-kube-api-access-k9cmb\") pod \"auto-csr-approver-29535922-bgv8w\" (UID: \"630dd7a9-ea0e-4a92-aedc-8f737ea48316\") " pod="openshift-infra/auto-csr-approver-29535922-bgv8w" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.474875 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-bgv8w" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.544835 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jpm2" Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.768502 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-bgv8w"] Feb 27 01:22:00 crc kubenswrapper[4771]: W0227 01:22:00.779245 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630dd7a9_ea0e_4a92_aedc_8f737ea48316.slice/crio-56cd8dff30ca976b98e1d49327765c9c0b7c9bf897f0c2889d09e35c216cb746 WatchSource:0}: Error finding container 56cd8dff30ca976b98e1d49327765c9c0b7c9bf897f0c2889d09e35c216cb746: Status 404 returned error can't find the container with id 56cd8dff30ca976b98e1d49327765c9c0b7c9bf897f0c2889d09e35c216cb746 Feb 27 01:22:00 crc kubenswrapper[4771]: I0227 01:22:00.966324 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-bgv8w" event={"ID":"630dd7a9-ea0e-4a92-aedc-8f737ea48316","Type":"ContainerStarted","Data":"56cd8dff30ca976b98e1d49327765c9c0b7c9bf897f0c2889d09e35c216cb746"} Feb 27 01:22:01 crc kubenswrapper[4771]: I0227 01:22:01.039363 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-65x55" Feb 27 01:22:01 crc kubenswrapper[4771]: I0227 01:22:01.073582 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2qcds" Feb 27 01:22:01 crc kubenswrapper[4771]: I0227 01:22:01.262369 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-6j9rs" Feb 27 01:22:01 crc kubenswrapper[4771]: I0227 01:22:01.425624 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wb7w9" Feb 27 01:22:01 crc kubenswrapper[4771]: I0227 01:22:01.498845 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-cp5l8" Feb 27 01:22:01 crc kubenswrapper[4771]: I0227 01:22:01.504031 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-lg7vn" Feb 27 01:22:06 crc kubenswrapper[4771]: I0227 01:22:06.013614 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-bgv8w" event={"ID":"630dd7a9-ea0e-4a92-aedc-8f737ea48316","Type":"ContainerDied","Data":"3d97010849da7ce7464351a42782e10357865d01cc58a899ca02dd5b8a8ff0f1"} Feb 27 01:22:06 crc kubenswrapper[4771]: I0227 01:22:06.013511 4771 generic.go:334] "Generic (PLEG): container finished" podID="630dd7a9-ea0e-4a92-aedc-8f737ea48316" containerID="3d97010849da7ce7464351a42782e10357865d01cc58a899ca02dd5b8a8ff0f1" exitCode=0 Feb 27 01:22:06 crc kubenswrapper[4771]: I0227 01:22:06.594858 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-4p5fg" Feb 27 01:22:06 crc kubenswrapper[4771]: I0227 01:22:06.700052 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq" Feb 27 01:22:07 crc kubenswrapper[4771]: I0227 01:22:07.371855 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-bgv8w" Feb 27 01:22:07 crc kubenswrapper[4771]: I0227 01:22:07.544158 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9cmb\" (UniqueName: \"kubernetes.io/projected/630dd7a9-ea0e-4a92-aedc-8f737ea48316-kube-api-access-k9cmb\") pod \"630dd7a9-ea0e-4a92-aedc-8f737ea48316\" (UID: \"630dd7a9-ea0e-4a92-aedc-8f737ea48316\") " Feb 27 01:22:07 crc kubenswrapper[4771]: I0227 01:22:07.553942 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630dd7a9-ea0e-4a92-aedc-8f737ea48316-kube-api-access-k9cmb" (OuterVolumeSpecName: "kube-api-access-k9cmb") pod "630dd7a9-ea0e-4a92-aedc-8f737ea48316" (UID: "630dd7a9-ea0e-4a92-aedc-8f737ea48316"). InnerVolumeSpecName "kube-api-access-k9cmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:07 crc kubenswrapper[4771]: I0227 01:22:07.645359 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9cmb\" (UniqueName: \"kubernetes.io/projected/630dd7a9-ea0e-4a92-aedc-8f737ea48316-kube-api-access-k9cmb\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:08 crc kubenswrapper[4771]: I0227 01:22:08.029042 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535922-bgv8w" event={"ID":"630dd7a9-ea0e-4a92-aedc-8f737ea48316","Type":"ContainerDied","Data":"56cd8dff30ca976b98e1d49327765c9c0b7c9bf897f0c2889d09e35c216cb746"} Feb 27 01:22:08 crc kubenswrapper[4771]: I0227 01:22:08.029342 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56cd8dff30ca976b98e1d49327765c9c0b7c9bf897f0c2889d09e35c216cb746" Feb 27 01:22:08 crc kubenswrapper[4771]: I0227 01:22:08.029075 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535922-bgv8w" Feb 27 01:22:08 crc kubenswrapper[4771]: I0227 01:22:08.435457 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-v5rj9"] Feb 27 01:22:08 crc kubenswrapper[4771]: I0227 01:22:08.440290 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535916-v5rj9"] Feb 27 01:22:09 crc kubenswrapper[4771]: I0227 01:22:09.787612 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="921de8cb-d569-47a5-97c8-ad7f94db475e" path="/var/lib/kubelet/pods/921de8cb-d569-47a5-97c8-ad7f94db475e/volumes" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.037839 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mhrfj"] Feb 27 01:22:25 crc kubenswrapper[4771]: E0227 01:22:25.038817 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630dd7a9-ea0e-4a92-aedc-8f737ea48316" containerName="oc" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.038834 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="630dd7a9-ea0e-4a92-aedc-8f737ea48316" containerName="oc" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.039034 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="630dd7a9-ea0e-4a92-aedc-8f737ea48316" containerName="oc" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.039970 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.042028 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.043463 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.043659 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.044927 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mhrfj"] Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.048942 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4jxxf" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.095305 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-727vl"] Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.096766 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.100720 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.106289 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-727vl"] Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.217727 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rw4q\" (UniqueName: \"kubernetes.io/projected/1dfe8293-b488-48de-8990-c65bf4e63cd7-kube-api-access-5rw4q\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.217781 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsgcm\" (UniqueName: \"kubernetes.io/projected/7668329a-cf46-4d1f-bf55-10197a60906f-kube-api-access-xsgcm\") pod \"dnsmasq-dns-675f4bcbfc-mhrfj\" (UID: \"7668329a-cf46-4d1f-bf55-10197a60906f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.217835 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.217853 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7668329a-cf46-4d1f-bf55-10197a60906f-config\") pod \"dnsmasq-dns-675f4bcbfc-mhrfj\" (UID: \"7668329a-cf46-4d1f-bf55-10197a60906f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.217937 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-config\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.318983 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rw4q\" (UniqueName: \"kubernetes.io/projected/1dfe8293-b488-48de-8990-c65bf4e63cd7-kube-api-access-5rw4q\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.319059 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsgcm\" (UniqueName: \"kubernetes.io/projected/7668329a-cf46-4d1f-bf55-10197a60906f-kube-api-access-xsgcm\") pod \"dnsmasq-dns-675f4bcbfc-mhrfj\" (UID: \"7668329a-cf46-4d1f-bf55-10197a60906f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.319100 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.319137 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7668329a-cf46-4d1f-bf55-10197a60906f-config\") pod \"dnsmasq-dns-675f4bcbfc-mhrfj\" (UID: \"7668329a-cf46-4d1f-bf55-10197a60906f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.319258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-config\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.320316 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7668329a-cf46-4d1f-bf55-10197a60906f-config\") pod \"dnsmasq-dns-675f4bcbfc-mhrfj\" (UID: \"7668329a-cf46-4d1f-bf55-10197a60906f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.320326 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.320649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-config\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.339149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsgcm\" (UniqueName: \"kubernetes.io/projected/7668329a-cf46-4d1f-bf55-10197a60906f-kube-api-access-xsgcm\") pod \"dnsmasq-dns-675f4bcbfc-mhrfj\" (UID: \"7668329a-cf46-4d1f-bf55-10197a60906f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.342499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rw4q\" (UniqueName: \"kubernetes.io/projected/1dfe8293-b488-48de-8990-c65bf4e63cd7-kube-api-access-5rw4q\") pod \"dnsmasq-dns-78dd6ddcc-727vl\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.359691 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.414335 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.823316 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mhrfj"] Feb 27 01:22:25 crc kubenswrapper[4771]: W0227 01:22:25.828135 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7668329a_cf46_4d1f_bf55_10197a60906f.slice/crio-e9e46524f6d23f58a804c6dceec19bf713145fdc3cfecdb77a8caabf5d3da120 WatchSource:0}: Error finding container e9e46524f6d23f58a804c6dceec19bf713145fdc3cfecdb77a8caabf5d3da120: Status 404 returned error can't find the container with id e9e46524f6d23f58a804c6dceec19bf713145fdc3cfecdb77a8caabf5d3da120 Feb 27 01:22:25 crc kubenswrapper[4771]: I0227 01:22:25.897459 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-727vl"] Feb 27 01:22:25 crc kubenswrapper[4771]: W0227 01:22:25.905659 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dfe8293_b488_48de_8990_c65bf4e63cd7.slice/crio-a06b8862571e292f6258fa2b6c2c8cd9279f9f0bec0430999ed609172993ba64 WatchSource:0}: Error finding container a06b8862571e292f6258fa2b6c2c8cd9279f9f0bec0430999ed609172993ba64: Status 404 returned error can't find the container with id a06b8862571e292f6258fa2b6c2c8cd9279f9f0bec0430999ed609172993ba64 Feb 27 01:22:26 crc kubenswrapper[4771]: I0227 01:22:26.178441 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" event={"ID":"7668329a-cf46-4d1f-bf55-10197a60906f","Type":"ContainerStarted","Data":"e9e46524f6d23f58a804c6dceec19bf713145fdc3cfecdb77a8caabf5d3da120"} Feb 27 01:22:26 crc kubenswrapper[4771]: I0227 01:22:26.179800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" event={"ID":"1dfe8293-b488-48de-8990-c65bf4e63cd7","Type":"ContainerStarted","Data":"a06b8862571e292f6258fa2b6c2c8cd9279f9f0bec0430999ed609172993ba64"} Feb 27 01:22:27 crc kubenswrapper[4771]: I0227 01:22:27.938104 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mhrfj"] Feb 27 01:22:27 crc kubenswrapper[4771]: I0227 01:22:27.961669 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cnbf7"] Feb 27 01:22:27 crc kubenswrapper[4771]: I0227 01:22:27.962883 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:27 crc kubenswrapper[4771]: I0227 01:22:27.975827 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cnbf7"] Feb 27 01:22:27 crc kubenswrapper[4771]: I0227 01:22:27.981619 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-config\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:27 crc kubenswrapper[4771]: I0227 01:22:27.981671 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-dns-svc\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:27 crc kubenswrapper[4771]: I0227 01:22:27.981740 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjwpc\" (UniqueName: \"kubernetes.io/projected/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-kube-api-access-vjwpc\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.083272 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjwpc\" (UniqueName: \"kubernetes.io/projected/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-kube-api-access-vjwpc\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.083337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-config\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.083371 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-dns-svc\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.084446 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-dns-svc\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.085196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-config\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.111509 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjwpc\" (UniqueName: 
\"kubernetes.io/projected/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-kube-api-access-vjwpc\") pod \"dnsmasq-dns-666b6646f7-cnbf7\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") " pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.219714 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-727vl"] Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.249153 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cb5pm"] Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.250343 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.260788 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cb5pm"] Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.282337 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.286052 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.286121 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-config\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.286153 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rqzf\" (UniqueName: \"kubernetes.io/projected/b40b4842-d003-44ce-aa40-f298d8deced5-kube-api-access-9rqzf\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.386917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rqzf\" (UniqueName: \"kubernetes.io/projected/b40b4842-d003-44ce-aa40-f298d8deced5-kube-api-access-9rqzf\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.387012 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.387057 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-config\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.387953 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-config\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.388118 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.409283 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rqzf\" (UniqueName: \"kubernetes.io/projected/b40b4842-d003-44ce-aa40-f298d8deced5-kube-api-access-9rqzf\") pod \"dnsmasq-dns-57d769cc4f-cb5pm\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.574860 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:28 crc kubenswrapper[4771]: I0227 01:22:28.916863 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cnbf7"] Feb 27 01:22:28 crc kubenswrapper[4771]: W0227 01:22:28.923715 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2d32d09_97a5_4e81_adf1_4f7be9ad8fc1.slice/crio-cb5f7a8b9a9b4c57ba4ef9ddd84d972b443b4aa02c24892cd1c4552e67f6c66d WatchSource:0}: Error finding container cb5f7a8b9a9b4c57ba4ef9ddd84d972b443b4aa02c24892cd1c4552e67f6c66d: Status 404 returned error can't find the container with id cb5f7a8b9a9b4c57ba4ef9ddd84d972b443b4aa02c24892cd1c4552e67f6c66d Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.055399 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cb5pm"] Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.104333 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.106058 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.107995 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.109535 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.116176 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.116355 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sl82b" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.116403 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.116517 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.116367 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.121937 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.212854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" event={"ID":"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1","Type":"ContainerStarted","Data":"cb5f7a8b9a9b4c57ba4ef9ddd84d972b443b4aa02c24892cd1c4552e67f6c66d"} Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.213923 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" event={"ID":"b40b4842-d003-44ce-aa40-f298d8deced5","Type":"ContainerStarted","Data":"a0bdbbc6c4fdd8389c8f8cb66277ce7d153d256f6c8fe838e45cd633974cc85f"} Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297296 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297413 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-config-data\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 
27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297466 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqvl\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-kube-api-access-6mqvl\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297485 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297519 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297573 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2c84581-5806-46dd-b352-390ef2d9826c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297593 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.297611 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2c84581-5806-46dd-b352-390ef2d9826c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.394601 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.395952 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.399415 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.399501 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.399691 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.399788 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.399822 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.400513 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pd5wj" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401309 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401384 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2c84581-5806-46dd-b352-390ef2d9826c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401415 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2c84581-5806-46dd-b352-390ef2d9826c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401503 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401530 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401583 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-config-data\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401605 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401634 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqvl\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-kube-api-access-6mqvl\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.401659 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.402049 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.403273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.403595 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.406315 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.406894 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.408582 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2c84581-5806-46dd-b352-390ef2d9826c-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.409341 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.418513 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.419194 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2c84581-5806-46dd-b352-390ef2d9826c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.420076 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.425060 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-config-data\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.427752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.445119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqvl\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-kube-api-access-6mqvl\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.500687 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.505577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.505774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 
27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.505844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j2lp\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-kube-api-access-4j2lp\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.505910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3aec8d2-008a-4b77-a30b-23f8e812e332-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.505933 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3aec8d2-008a-4b77-a30b-23f8e812e332-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.505986 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.506014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.506054 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.506106 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.506230 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.506349 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.607775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.607830 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.607869 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.607898 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j2lp\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-kube-api-access-4j2lp\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.607943 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3aec8d2-008a-4b77-a30b-23f8e812e332-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.607965 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3aec8d2-008a-4b77-a30b-23f8e812e332-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.608006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.608000 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.608027 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.608056 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.608095 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.608117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.609006 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.609094 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.609413 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.610585 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.611444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.613651 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.614154 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3aec8d2-008a-4b77-a30b-23f8e812e332-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.614919 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3aec8d2-008a-4b77-a30b-23f8e812e332-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.617099 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.623162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j2lp\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-kube-api-access-4j2lp\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.627912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.759227 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 01:22:29 crc kubenswrapper[4771]: I0227 01:22:29.821535 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.580778 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.583449 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.586874 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.587047 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.587089 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-r9zjc" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.587487 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.593440 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.594020 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.726096 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-config-data-default\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.726442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8be4acd2-0f92-4f9f-9521-5da586b712f0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.726585 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be4acd2-0f92-4f9f-9521-5da586b712f0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.726716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.726809 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-kolla-config\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.726893 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.726989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be4acd2-0f92-4f9f-9521-5da586b712f0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.727117 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj69z\" (UniqueName: \"kubernetes.io/projected/8be4acd2-0f92-4f9f-9521-5da586b712f0-kube-api-access-sj69z\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.828701 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj69z\" (UniqueName: \"kubernetes.io/projected/8be4acd2-0f92-4f9f-9521-5da586b712f0-kube-api-access-sj69z\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.828767 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-config-data-default\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.828811 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8be4acd2-0f92-4f9f-9521-5da586b712f0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.828846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be4acd2-0f92-4f9f-9521-5da586b712f0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.828866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.828885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-kolla-config\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.828900 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.828921 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be4acd2-0f92-4f9f-9521-5da586b712f0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.829401 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8be4acd2-0f92-4f9f-9521-5da586b712f0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.829710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-kolla-config\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.829758 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-config-data-default\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.830253 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.830542 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8be4acd2-0f92-4f9f-9521-5da586b712f0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.843704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be4acd2-0f92-4f9f-9521-5da586b712f0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.843933 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be4acd2-0f92-4f9f-9521-5da586b712f0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.847934 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.848188 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj69z\" (UniqueName: \"kubernetes.io/projected/8be4acd2-0f92-4f9f-9521-5da586b712f0-kube-api-access-sj69z\") pod \"openstack-galera-0\" (UID: \"8be4acd2-0f92-4f9f-9521-5da586b712f0\") " pod="openstack/openstack-galera-0" Feb 27 01:22:30 crc kubenswrapper[4771]: I0227 01:22:30.906689 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 27 01:22:31 crc kubenswrapper[4771]: I0227 01:22:31.993678 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 01:22:31 crc kubenswrapper[4771]: I0227 01:22:31.994998 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:31 crc kubenswrapper[4771]: I0227 01:22:31.998225 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 27 01:22:31 crc kubenswrapper[4771]: I0227 01:22:31.998508 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 27 01:22:31 crc kubenswrapper[4771]: I0227 01:22:31.998685 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-84hk9" Feb 27 01:22:31 crc kubenswrapper[4771]: I0227 01:22:31.998858 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.054331 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.149430 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.149480 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.149532 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.149599 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.149744 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.149791 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.149948 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.149967 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmbgv\" (UniqueName: \"kubernetes.io/projected/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-kube-api-access-dmbgv\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.152366 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.153437 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.155658 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vq97s" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.157418 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.157880 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.186645 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251530 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251572 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60504948-6e27-4eb7-b057-4634a1951a8c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251619 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzfz\" (UniqueName: \"kubernetes.io/projected/60504948-6e27-4eb7-b057-4634a1951a8c-kube-api-access-vpzfz\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251646 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251663 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbgv\" (UniqueName: \"kubernetes.io/projected/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-kube-api-access-dmbgv\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251711 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251734 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251752 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60504948-6e27-4eb7-b057-4634a1951a8c-kolla-config\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251779 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251799 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/60504948-6e27-4eb7-b057-4634a1951a8c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.251819 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60504948-6e27-4eb7-b057-4634a1951a8c-config-data\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.252502 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.252859 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.253394 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.253800 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.254029 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.258074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.259181 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.270058 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmbgv\" (UniqueName: \"kubernetes.io/projected/39fb27d1-e9a6-44e4-9f92-d5f0242a8007-kube-api-access-dmbgv\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.290543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"39fb27d1-e9a6-44e4-9f92-d5f0242a8007\") " pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.317533 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.353220 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60504948-6e27-4eb7-b057-4634a1951a8c-kolla-config\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.353300 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/60504948-6e27-4eb7-b057-4634a1951a8c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.353331 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60504948-6e27-4eb7-b057-4634a1951a8c-config-data\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.353369 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60504948-6e27-4eb7-b057-4634a1951a8c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.353428 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzfz\" (UniqueName: \"kubernetes.io/projected/60504948-6e27-4eb7-b057-4634a1951a8c-kube-api-access-vpzfz\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.354294 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60504948-6e27-4eb7-b057-4634a1951a8c-kolla-config\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.354322 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60504948-6e27-4eb7-b057-4634a1951a8c-config-data\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.356452 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60504948-6e27-4eb7-b057-4634a1951a8c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.361131 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/60504948-6e27-4eb7-b057-4634a1951a8c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.369928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzfz\" (UniqueName: \"kubernetes.io/projected/60504948-6e27-4eb7-b057-4634a1951a8c-kube-api-access-vpzfz\") pod \"memcached-0\" (UID: 
\"60504948-6e27-4eb7-b057-4634a1951a8c\") " pod="openstack/memcached-0" Feb 27 01:22:32 crc kubenswrapper[4771]: I0227 01:22:32.482784 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 27 01:22:34 crc kubenswrapper[4771]: I0227 01:22:34.349362 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:22:34 crc kubenswrapper[4771]: I0227 01:22:34.351342 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 01:22:34 crc kubenswrapper[4771]: I0227 01:22:34.354139 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rmnz2" Feb 27 01:22:34 crc kubenswrapper[4771]: I0227 01:22:34.358505 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:22:34 crc kubenswrapper[4771]: I0227 01:22:34.490168 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpnzs\" (UniqueName: \"kubernetes.io/projected/3ba1222d-39ed-4c00-a636-86788e0f6db6-kube-api-access-qpnzs\") pod \"kube-state-metrics-0\" (UID: \"3ba1222d-39ed-4c00-a636-86788e0f6db6\") " pod="openstack/kube-state-metrics-0" Feb 27 01:22:34 crc kubenswrapper[4771]: I0227 01:22:34.592105 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpnzs\" (UniqueName: \"kubernetes.io/projected/3ba1222d-39ed-4c00-a636-86788e0f6db6-kube-api-access-qpnzs\") pod \"kube-state-metrics-0\" (UID: \"3ba1222d-39ed-4c00-a636-86788e0f6db6\") " pod="openstack/kube-state-metrics-0" Feb 27 01:22:34 crc kubenswrapper[4771]: I0227 01:22:34.610149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpnzs\" (UniqueName: \"kubernetes.io/projected/3ba1222d-39ed-4c00-a636-86788e0f6db6-kube-api-access-qpnzs\") pod \"kube-state-metrics-0\" (UID: \"3ba1222d-39ed-4c00-a636-86788e0f6db6\") " pod="openstack/kube-state-metrics-0" Feb 27 01:22:34 crc kubenswrapper[4771]: I0227 01:22:34.667840 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.554372 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-92wll"] Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.556260 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.575804 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-92wll"] Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.706180 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-utilities\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.706302 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6hk\" (UniqueName: \"kubernetes.io/projected/bbec25a6-8536-4f09-af33-bd1a36b9e051-kube-api-access-rx6hk\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.706347 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-catalog-content\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.808845 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-utilities\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.809292 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx6hk\" (UniqueName: \"kubernetes.io/projected/bbec25a6-8536-4f09-af33-bd1a36b9e051-kube-api-access-rx6hk\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.809367 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-catalog-content\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.809435 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-utilities\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.809772 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-catalog-content\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.830164 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rx6hk\" (UniqueName: \"kubernetes.io/projected/bbec25a6-8536-4f09-af33-bd1a36b9e051-kube-api-access-rx6hk\") pod \"redhat-marketplace-92wll\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:35 crc kubenswrapper[4771]: I0227 01:22:35.872544 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.519962 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.521402 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.526969 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.527246 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.527695 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.527890 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-l5c7j" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.528047 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.529583 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.640034 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13396b98-6f5b-4800-854f-7b7d6af4cda4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.640091 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.640236 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6t9\" (UniqueName: \"kubernetes.io/projected/13396b98-6f5b-4800-854f-7b7d6af4cda4-kube-api-access-vf6t9\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.640352 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.640438 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.640520 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13396b98-6f5b-4800-854f-7b7d6af4cda4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.640613 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13396b98-6f5b-4800-854f-7b7d6af4cda4-config\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.640706 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.672580 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s5lkp"] Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.673723 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.680633 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.680899 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7884f" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.680944 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.691345 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tjchc"] Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.692892 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.720357 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5lkp"] Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.732907 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tjchc"] Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.742427 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13396b98-6f5b-4800-854f-7b7d6af4cda4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.742470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.742524 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6t9\" (UniqueName: \"kubernetes.io/projected/13396b98-6f5b-4800-854f-7b7d6af4cda4-kube-api-access-vf6t9\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.742585 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.742611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.742644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13396b98-6f5b-4800-854f-7b7d6af4cda4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.742682 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13396b98-6f5b-4800-854f-7b7d6af4cda4-config\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.742725 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.743068 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" 
(UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.743586 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13396b98-6f5b-4800-854f-7b7d6af4cda4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.744215 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13396b98-6f5b-4800-854f-7b7d6af4cda4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.744533 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13396b98-6f5b-4800-854f-7b7d6af4cda4-config\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.755984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.758571 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.759904 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6t9\" (UniqueName: \"kubernetes.io/projected/13396b98-6f5b-4800-854f-7b7d6af4cda4-kube-api-access-vf6t9\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.760982 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13396b98-6f5b-4800-854f-7b7d6af4cda4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.784313 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"13396b98-6f5b-4800-854f-7b7d6af4cda4\") " pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.842916 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.843730 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c578c69-744e-425b-8bb1-76eec4b332ec-ovn-controller-tls-certs\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.843780 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-log\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.843815 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-run\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.843861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-lib\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.843911 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c578c69-744e-425b-8bb1-76eec4b332ec-combined-ca-bundle\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.843954 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-log-ovn\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.844001 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-run-ovn\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.844029 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/000564b2-d16b-45fb-ba91-e65b85bd7fb5-scripts\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.844077 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7wt8\" (UniqueName: \"kubernetes.io/projected/8c578c69-744e-425b-8bb1-76eec4b332ec-kube-api-access-z7wt8\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " 
pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.844110 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-etc-ovs\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.844134 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spq46\" (UniqueName: \"kubernetes.io/projected/000564b2-d16b-45fb-ba91-e65b85bd7fb5-kube-api-access-spq46\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.844157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c578c69-744e-425b-8bb1-76eec4b332ec-scripts\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.844179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-run\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945455 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c578c69-744e-425b-8bb1-76eec4b332ec-ovn-controller-tls-certs\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945505 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-log\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945535 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-run\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945612 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-lib\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945673 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c578c69-744e-425b-8bb1-76eec4b332ec-combined-ca-bundle\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945717 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-log-ovn\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-run-ovn\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945779 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/000564b2-d16b-45fb-ba91-e65b85bd7fb5-scripts\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945807 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7wt8\" (UniqueName: \"kubernetes.io/projected/8c578c69-744e-425b-8bb1-76eec4b332ec-kube-api-access-z7wt8\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-etc-ovs\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945858 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spq46\" (UniqueName: \"kubernetes.io/projected/000564b2-d16b-45fb-ba91-e65b85bd7fb5-kube-api-access-spq46\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c578c69-744e-425b-8bb1-76eec4b332ec-scripts\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.945907 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-run\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.947667 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-lib\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.947718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-run\") pod \"ovn-controller-ovs-tjchc\" (UID: 
\"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.947740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-run\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.947885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-var-log\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.947954 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-log-ovn\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.948119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/000564b2-d16b-45fb-ba91-e65b85bd7fb5-etc-ovs\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.948273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c578c69-744e-425b-8bb1-76eec4b332ec-var-run-ovn\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.949403 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c578c69-744e-425b-8bb1-76eec4b332ec-scripts\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.949444 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/000564b2-d16b-45fb-ba91-e65b85bd7fb5-scripts\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.949731 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c578c69-744e-425b-8bb1-76eec4b332ec-ovn-controller-tls-certs\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.950747 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c578c69-744e-425b-8bb1-76eec4b332ec-combined-ca-bundle\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.970892 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7wt8\" (UniqueName: 
\"kubernetes.io/projected/8c578c69-744e-425b-8bb1-76eec4b332ec-kube-api-access-z7wt8\") pod \"ovn-controller-s5lkp\" (UID: \"8c578c69-744e-425b-8bb1-76eec4b332ec\") " pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.971410 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spq46\" (UniqueName: \"kubernetes.io/projected/000564b2-d16b-45fb-ba91-e65b85bd7fb5-kube-api-access-spq46\") pod \"ovn-controller-ovs-tjchc\" (UID: \"000564b2-d16b-45fb-ba91-e65b85bd7fb5\") " pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:37 crc kubenswrapper[4771]: I0227 01:22:37.993637 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp" Feb 27 01:22:38 crc kubenswrapper[4771]: I0227 01:22:38.012391 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.240826 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.244633 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.247996 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.248078 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.248270 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-vz9km" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.248376 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.262064 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.295802 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.295892 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ed3a808f-7dba-4f32-a081-29eab07e84c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.295919 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed3a808f-7dba-4f32-a081-29eab07e84c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.295941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rppsk\" (UniqueName: \"kubernetes.io/projected/ed3a808f-7dba-4f32-a081-29eab07e84c0-kube-api-access-rppsk\") pod \"ovsdbserver-sb-0\" (UID: 
\"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.295978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.296121 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.296173 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.296275 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed3a808f-7dba-4f32-a081-29eab07e84c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.397466 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed3a808f-7dba-4f32-a081-29eab07e84c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.397512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.397622 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ed3a808f-7dba-4f32-a081-29eab07e84c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.397648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed3a808f-7dba-4f32-a081-29eab07e84c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.397672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rppsk\" (UniqueName: \"kubernetes.io/projected/ed3a808f-7dba-4f32-a081-29eab07e84c0-kube-api-access-rppsk\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.397695 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.397716 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.397734 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.398804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed3a808f-7dba-4f32-a081-29eab07e84c0-config\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.398810 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.399301 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed3a808f-7dba-4f32-a081-29eab07e84c0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.399671 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ed3a808f-7dba-4f32-a081-29eab07e84c0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.405451 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.408272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.421027 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3a808f-7dba-4f32-a081-29eab07e84c0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc 
kubenswrapper[4771]: I0227 01:22:41.422365 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rppsk\" (UniqueName: \"kubernetes.io/projected/ed3a808f-7dba-4f32-a081-29eab07e84c0-kube-api-access-rppsk\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.427336 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ed3a808f-7dba-4f32-a081-29eab07e84c0\") " pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:41 crc kubenswrapper[4771]: I0227 01:22:41.562676 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:42 crc kubenswrapper[4771]: E0227 01:22:42.457594 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 01:22:42 crc kubenswrapper[4771]: E0227 01:22:42.457906 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5rw4q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-727vl_openstack(1dfe8293-b488-48de-8990-c65bf4e63cd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 
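The PullImage failure above ends in "rpc error: code = Canceled desc = copying config: context canceled", and the same error repeats just below for the second dnsmasq pod: the CRI pull was aborted partway through copying the image, and both pods are in fact deleted moments later in this journal. A self-contained sketch of how a cancelled context surfaces in exactly that shape; pullImage here is a stand-in, not the real CRI client API.

// Demonstrates the "context canceled" error shape from an aborted pull.
package main

import (
	"context"
	"fmt"
	"time"
)

// pullImage stands in for a long-running CRI PullImage call.
func pullImage(ctx context.Context, image string) error {
	select {
	case <-time.After(10 * time.Second): // pretend the image copy takes a while
		return nil
	case <-ctx.Done(): // pull aborted, e.g. because the pod was deleted
		return fmt.Errorf("copying config: %w", ctx.Err())
	}
}

func main() {
	ctx, cancel := context.WithCancel(context.Background())
	go func() { time.Sleep(100 * time.Millisecond); cancel() }() // deletion mid-pull
	img := "quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
	if err := pullImage(ctx, img); err != nil {
		fmt.Println("PullImage from image service failed:", err) // ... context canceled
	}
}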
01:22:42 crc kubenswrapper[4771]: E0227 01:22:42.460843 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" podUID="1dfe8293-b488-48de-8990-c65bf4e63cd7" Feb 27 01:22:42 crc kubenswrapper[4771]: E0227 01:22:42.500860 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 27 01:22:42 crc kubenswrapper[4771]: E0227 01:22:42.501185 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xsgcm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-mhrfj_openstack(7668329a-cf46-4d1f-bf55-10197a60906f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:22:42 crc kubenswrapper[4771]: E0227 01:22:42.502455 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" podUID="7668329a-cf46-4d1f-bf55-10197a60906f" Feb 27 01:22:42 crc kubenswrapper[4771]: I0227 01:22:42.643173 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 27 01:22:42 crc kubenswrapper[4771]: I0227 01:22:42.926926 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:22:43 crc 
kubenswrapper[4771]: W0227 01:22:43.025355 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3aec8d2_008a_4b77_a30b_23f8e812e332.slice/crio-9aa91a0925394f3ab132912f5de855e61d621abd853412282c32c069f18c50a5 WatchSource:0}: Error finding container 9aa91a0925394f3ab132912f5de855e61d621abd853412282c32c069f18c50a5: Status 404 returned error can't find the container with id 9aa91a0925394f3ab132912f5de855e61d621abd853412282c32c069f18c50a5 Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.069427 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.089345 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:22:43 crc kubenswrapper[4771]: W0227 01:22:43.093819 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c84581_5806_46dd_b352_390ef2d9826c.slice/crio-ff6ef1cf89726a6dc0f950f4744bd95917409a77a16eeef7189f699af48b4915 WatchSource:0}: Error finding container ff6ef1cf89726a6dc0f950f4744bd95917409a77a16eeef7189f699af48b4915: Status 404 returned error can't find the container with id ff6ef1cf89726a6dc0f950f4744bd95917409a77a16eeef7189f699af48b4915 Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.176097 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-92wll"] Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.182476 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.281166 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.288740 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5lkp"] Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.300776 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.309112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a3aec8d2-008a-4b77-a30b-23f8e812e332","Type":"ContainerStarted","Data":"9aa91a0925394f3ab132912f5de855e61d621abd853412282c32c069f18c50a5"} Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.314473 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"60504948-6e27-4eb7-b057-4634a1951a8c","Type":"ContainerStarted","Data":"6d719c523cdbb78d971eebfb5d8978a9527d00191f33f038cbdcb3da22f5de64"} Feb 27 01:22:43 crc kubenswrapper[4771]: W0227 01:22:43.314589 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ba1222d_39ed_4c00_a636_86788e0f6db6.slice/crio-e88c2f71086568b73c70d3d780138057c9c8504470999892f2bef34040219e5f WatchSource:0}: Error finding container e88c2f71086568b73c70d3d780138057c9c8504470999892f2bef34040219e5f: Status 404 returned error can't find the container with id e88c2f71086568b73c70d3d780138057c9c8504470999892f2bef34040219e5f Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.316363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"39fb27d1-e9a6-44e4-9f92-d5f0242a8007","Type":"ContainerStarted","Data":"a7662b6844f389e602e1bcadf857869e5e7b7880ee9b5e107c5ac7de30b934e8"} Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.319422 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92wll" event={"ID":"bbec25a6-8536-4f09-af33-bd1a36b9e051","Type":"ContainerStarted","Data":"6c342f683c424c08563c348ad7c6b65e607176eb21825c0b47e11ebad3e33ebe"} Feb 27 01:22:43 crc kubenswrapper[4771]: W0227 01:22:43.320376 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c578c69_744e_425b_8bb1_76eec4b332ec.slice/crio-adaeb3265edb4c1f227cfff6c25198d46bd74d1ae45a4c474c1bf5d482dd6168 WatchSource:0}: Error finding container adaeb3265edb4c1f227cfff6c25198d46bd74d1ae45a4c474c1bf5d482dd6168: Status 404 returned error can't find the container with id adaeb3265edb4c1f227cfff6c25198d46bd74d1ae45a4c474c1bf5d482dd6168 Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.323193 4771 generic.go:334] "Generic (PLEG): container finished" podID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" containerID="a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943" exitCode=0 Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.323234 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" event={"ID":"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1","Type":"ContainerDied","Data":"a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943"} Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.328140 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8be4acd2-0f92-4f9f-9521-5da586b712f0","Type":"ContainerStarted","Data":"0c0fc5609f9411144f096eef2500cb7842d8b7104c3af19a9526b84bfac931a6"} Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.339586 4771 generic.go:334] "Generic (PLEG): container finished" podID="b40b4842-d003-44ce-aa40-f298d8deced5" containerID="22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348" exitCode=0 Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.343426 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" event={"ID":"b40b4842-d003-44ce-aa40-f298d8deced5","Type":"ContainerDied","Data":"22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348"} Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.355432 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a2c84581-5806-46dd-b352-390ef2d9826c","Type":"ContainerStarted","Data":"ff6ef1cf89726a6dc0f950f4744bd95917409a77a16eeef7189f699af48b4915"} Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.388041 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tjchc"] Feb 27 01:22:43 crc kubenswrapper[4771]: W0227 01:22:43.446105 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod000564b2_d16b_45fb_ba91_e65b85bd7fb5.slice/crio-390da1b6d9901be3185a09e1b109165ce8f9488aa5e94e81b0e7ab976cb5fc34 WatchSource:0}: Error finding container 390da1b6d9901be3185a09e1b109165ce8f9488aa5e94e81b0e7ab976cb5fc34: Status 404 returned error can't find the container with id 390da1b6d9901be3185a09e1b109165ce8f9488aa5e94e81b0e7ab976cb5fc34 Feb 27 01:22:43 crc kubenswrapper[4771]: E0227 01:22:43.607487 4771 log.go:32] 
"CreateContainer in sandbox from runtime service failed" err=< Feb 27 01:22:43 crc kubenswrapper[4771]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 27 01:22:43 crc kubenswrapper[4771]: > podSandboxID="cb5f7a8b9a9b4c57ba4ef9ddd84d972b443b4aa02c24892cd1c4552e67f6c66d" Feb 27 01:22:43 crc kubenswrapper[4771]: E0227 01:22:43.607634 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 01:22:43 crc kubenswrapper[4771]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vjwpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-cnbf7_openstack(c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 27 01:22:43 crc kubenswrapper[4771]: > 
logger="UnhandledError" Feb 27 01:22:43 crc kubenswrapper[4771]: E0227 01:22:43.608898 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" podUID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.678930 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.726487 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.741918 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-dns-svc\") pod \"1dfe8293-b488-48de-8990-c65bf4e63cd7\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.742259 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-config\") pod \"1dfe8293-b488-48de-8990-c65bf4e63cd7\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.742324 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rw4q\" (UniqueName: \"kubernetes.io/projected/1dfe8293-b488-48de-8990-c65bf4e63cd7-kube-api-access-5rw4q\") pod \"1dfe8293-b488-48de-8990-c65bf4e63cd7\" (UID: \"1dfe8293-b488-48de-8990-c65bf4e63cd7\") " Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.743496 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1dfe8293-b488-48de-8990-c65bf4e63cd7" (UID: "1dfe8293-b488-48de-8990-c65bf4e63cd7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.743530 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-config" (OuterVolumeSpecName: "config") pod "1dfe8293-b488-48de-8990-c65bf4e63cd7" (UID: "1dfe8293-b488-48de-8990-c65bf4e63cd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.755412 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dfe8293-b488-48de-8990-c65bf4e63cd7-kube-api-access-5rw4q" (OuterVolumeSpecName: "kube-api-access-5rw4q") pod "1dfe8293-b488-48de-8990-c65bf4e63cd7" (UID: "1dfe8293-b488-48de-8990-c65bf4e63cd7"). InnerVolumeSpecName "kube-api-access-5rw4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.844242 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7668329a-cf46-4d1f-bf55-10197a60906f-config\") pod \"7668329a-cf46-4d1f-bf55-10197a60906f\" (UID: \"7668329a-cf46-4d1f-bf55-10197a60906f\") " Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.844352 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsgcm\" (UniqueName: \"kubernetes.io/projected/7668329a-cf46-4d1f-bf55-10197a60906f-kube-api-access-xsgcm\") pod \"7668329a-cf46-4d1f-bf55-10197a60906f\" (UID: \"7668329a-cf46-4d1f-bf55-10197a60906f\") " Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.844770 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.844787 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dfe8293-b488-48de-8990-c65bf4e63cd7-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.844797 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rw4q\" (UniqueName: \"kubernetes.io/projected/1dfe8293-b488-48de-8990-c65bf4e63cd7-kube-api-access-5rw4q\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.844777 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7668329a-cf46-4d1f-bf55-10197a60906f-config" (OuterVolumeSpecName: "config") pod "7668329a-cf46-4d1f-bf55-10197a60906f" (UID: "7668329a-cf46-4d1f-bf55-10197a60906f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.847366 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7668329a-cf46-4d1f-bf55-10197a60906f-kube-api-access-xsgcm" (OuterVolumeSpecName: "kube-api-access-xsgcm") pod "7668329a-cf46-4d1f-bf55-10197a60906f" (UID: "7668329a-cf46-4d1f-bf55-10197a60906f"). InnerVolumeSpecName "kube-api-access-xsgcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.946673 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsgcm\" (UniqueName: \"kubernetes.io/projected/7668329a-cf46-4d1f-bf55-10197a60906f-kube-api-access-xsgcm\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:43 crc kubenswrapper[4771]: I0227 01:22:43.946711 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7668329a-cf46-4d1f-bf55-10197a60906f-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.132153 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.364457 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tjchc" event={"ID":"000564b2-d16b-45fb-ba91-e65b85bd7fb5","Type":"ContainerStarted","Data":"390da1b6d9901be3185a09e1b109165ce8f9488aa5e94e81b0e7ab976cb5fc34"} Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.366445 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" event={"ID":"b40b4842-d003-44ce-aa40-f298d8deced5","Type":"ContainerStarted","Data":"1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb"} Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.366577 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.368129 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13396b98-6f5b-4800-854f-7b7d6af4cda4","Type":"ContainerStarted","Data":"9fce98d70b42254bc8c4051c38c661c8c50c78a481181b160fefb0d8141dfc34"} Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.369120 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.369114 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-727vl" event={"ID":"1dfe8293-b488-48de-8990-c65bf4e63cd7","Type":"ContainerDied","Data":"a06b8862571e292f6258fa2b6c2c8cd9279f9f0bec0430999ed609172993ba64"} Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.371363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5lkp" event={"ID":"8c578c69-744e-425b-8bb1-76eec4b332ec","Type":"ContainerStarted","Data":"adaeb3265edb4c1f227cfff6c25198d46bd74d1ae45a4c474c1bf5d482dd6168"} Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.372743 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ba1222d-39ed-4c00-a636-86788e0f6db6","Type":"ContainerStarted","Data":"e88c2f71086568b73c70d3d780138057c9c8504470999892f2bef34040219e5f"} Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.375322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mhrfj" event={"ID":"7668329a-cf46-4d1f-bf55-10197a60906f","Type":"ContainerDied","Data":"e9e46524f6d23f58a804c6dceec19bf713145fdc3cfecdb77a8caabf5d3da120"} Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.375335 4771 util.go:48] "No ready sandbox for pod can be found. 
Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.377154 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerID="d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc" exitCode=0
Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.377220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92wll" event={"ID":"bbec25a6-8536-4f09-af33-bd1a36b9e051","Type":"ContainerDied","Data":"d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc"}
Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.382559 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" podStartSLOduration=2.8095837599999998 podStartE2EDuration="16.382533055s" podCreationTimestamp="2026-02-27 01:22:28 +0000 UTC" firstStartedPulling="2026-02-27 01:22:29.057110995 +0000 UTC m=+1061.994672283" lastFinishedPulling="2026-02-27 01:22:42.63006029 +0000 UTC m=+1075.567621578" observedRunningTime="2026-02-27 01:22:44.380597853 +0000 UTC m=+1077.318159161" watchObservedRunningTime="2026-02-27 01:22:44.382533055 +0000 UTC m=+1077.320094343"
Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.450985 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-727vl"]
Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.461951 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-727vl"]
Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.497527 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mhrfj"]
Feb 27 01:22:44 crc kubenswrapper[4771]: I0227 01:22:44.502166 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mhrfj"]
Feb 27 01:22:45 crc kubenswrapper[4771]: I0227 01:22:45.386414 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ed3a808f-7dba-4f32-a081-29eab07e84c0","Type":"ContainerStarted","Data":"41ee46173852eb178ce5c724c2616bbbd0ce2c0d738ab6c07a74ddde878b0a64"}
Feb 27 01:22:45 crc kubenswrapper[4771]: I0227 01:22:45.787744 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dfe8293-b488-48de-8990-c65bf4e63cd7" path="/var/lib/kubelet/pods/1dfe8293-b488-48de-8990-c65bf4e63cd7/volumes"
Feb 27 01:22:45 crc kubenswrapper[4771]: I0227 01:22:45.788743 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7668329a-cf46-4d1f-bf55-10197a60906f" path="/var/lib/kubelet/pods/7668329a-cf46-4d1f-bf55-10197a60906f/volumes"
Feb 27 01:22:48 crc kubenswrapper[4771]: I0227 01:22:48.577421 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm"
Feb 27 01:22:48 crc kubenswrapper[4771]: I0227 01:22:48.662912 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cnbf7"]
Feb 27 01:22:51 crc kubenswrapper[4771]: I0227 01:22:51.441755 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92wll" event={"ID":"bbec25a6-8536-4f09-af33-bd1a36b9e051","Type":"ContainerStarted","Data":"cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a"}
Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.452211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"39fb27d1-e9a6-44e4-9f92-d5f0242a8007","Type":"ContainerStarted","Data":"53a372b7f66b29cd8396bc554057c88b8f20095a4618cb80c6d418231d4cabdc"}
pod="openstack/openstack-cell1-galera-0" event={"ID":"39fb27d1-e9a6-44e4-9f92-d5f0242a8007","Type":"ContainerStarted","Data":"53a372b7f66b29cd8396bc554057c88b8f20095a4618cb80c6d418231d4cabdc"} Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.455811 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ba1222d-39ed-4c00-a636-86788e0f6db6","Type":"ContainerStarted","Data":"d2488f3c3f5a1d3c6a7dddb0cedd991b11c548193fbe969322d1495423e8e8ad"} Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.455979 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.458174 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerID="cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a" exitCode=0 Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.458242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92wll" event={"ID":"bbec25a6-8536-4f09-af33-bd1a36b9e051","Type":"ContainerDied","Data":"cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a"} Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.460362 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ed3a808f-7dba-4f32-a081-29eab07e84c0","Type":"ContainerStarted","Data":"10fad153d8e2522fefab95105212a167a9edea8ad2b5b0ffad3335182235cb9d"} Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.466121 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" event={"ID":"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1","Type":"ContainerStarted","Data":"95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3"} Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.466190 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" podUID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" containerName="dnsmasq-dns" containerID="cri-o://95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3" gracePeriod=10 Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.466270 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.470930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"60504948-6e27-4eb7-b057-4634a1951a8c","Type":"ContainerStarted","Data":"3214f20419fdea223868e00500742a41d615501c06cb179a3dad68f4ff8040ff"} Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.470993 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.476979 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8be4acd2-0f92-4f9f-9521-5da586b712f0","Type":"ContainerStarted","Data":"6dc4045f9ac794b2f079255b482601733897511a7189e688aafa9b9365e2a7e7"} Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.485763 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tjchc" event={"ID":"000564b2-d16b-45fb-ba91-e65b85bd7fb5","Type":"ContainerStarted","Data":"73365ef406c64fbc746647a54db0e38b95a379a3f3c9f67f67df5773cebcdfb0"} Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.487731 4771 
Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.491079 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5lkp" event={"ID":"8c578c69-744e-425b-8bb1-76eec4b332ec","Type":"ContainerStarted","Data":"46bbefbf3beb9d9097c1e074ab1415ea09d58617b57ca29ea85bc06a911014ed"}
Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.491356 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s5lkp"
Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.506094 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" podStartSLOduration=11.77947726 podStartE2EDuration="25.506068201s" podCreationTimestamp="2026-02-27 01:22:27 +0000 UTC" firstStartedPulling="2026-02-27 01:22:28.926226811 +0000 UTC m=+1061.863788099" lastFinishedPulling="2026-02-27 01:22:42.652817752 +0000 UTC m=+1075.590379040" observedRunningTime="2026-02-27 01:22:52.504741045 +0000 UTC m=+1085.442302373" watchObservedRunningTime="2026-02-27 01:22:52.506068201 +0000 UTC m=+1085.443629489"
Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.536847 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.649855807 podStartE2EDuration="20.536824802s" podCreationTimestamp="2026-02-27 01:22:32 +0000 UTC" firstStartedPulling="2026-02-27 01:22:42.670244398 +0000 UTC m=+1075.607805686" lastFinishedPulling="2026-02-27 01:22:50.557213393 +0000 UTC m=+1083.494774681" observedRunningTime="2026-02-27 01:22:52.526392336 +0000 UTC m=+1085.463953644" watchObservedRunningTime="2026-02-27 01:22:52.536824802 +0000 UTC m=+1085.474386090"
Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.557483 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.459361693 podStartE2EDuration="18.557463155s" podCreationTimestamp="2026-02-27 01:22:34 +0000 UTC" firstStartedPulling="2026-02-27 01:22:43.316986772 +0000 UTC m=+1076.254548060" lastFinishedPulling="2026-02-27 01:22:51.415088234 +0000 UTC m=+1084.352649522" observedRunningTime="2026-02-27 01:22:52.546208378 +0000 UTC m=+1085.483769676" watchObservedRunningTime="2026-02-27 01:22:52.557463155 +0000 UTC m=+1085.495024453"
Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.622403 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s5lkp" podStartSLOduration=8.307607221 podStartE2EDuration="15.622377898s" podCreationTimestamp="2026-02-27 01:22:37 +0000 UTC" firstStartedPulling="2026-02-27 01:22:43.331440087 +0000 UTC m=+1076.269001375" lastFinishedPulling="2026-02-27 01:22:50.646210764 +0000 UTC m=+1083.583772052" observedRunningTime="2026-02-27 01:22:52.619340835 +0000 UTC m=+1085.556902123" watchObservedRunningTime="2026-02-27 01:22:52.622377898 +0000 UTC m=+1085.559939186"
Feb 27 01:22:52 crc kubenswrapper[4771]: I0227 01:22:52.902472 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7"
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.014490 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-dns-svc\") pod \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") "
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.014651 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjwpc\" (UniqueName: \"kubernetes.io/projected/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-kube-api-access-vjwpc\") pod \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") "
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.014673 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-config\") pod \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\" (UID: \"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1\") "
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.020764 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-kube-api-access-vjwpc" (OuterVolumeSpecName: "kube-api-access-vjwpc") pod "c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" (UID: "c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1"). InnerVolumeSpecName "kube-api-access-vjwpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.045664 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-config" (OuterVolumeSpecName: "config") pod "c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" (UID: "c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.050387 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" (UID: "c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.116119 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.116159 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.116169 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjwpc\" (UniqueName: \"kubernetes.io/projected/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1-kube-api-access-vjwpc\") on node \"crc\" DevicePath \"\"" Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.500574 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a3aec8d2-008a-4b77-a30b-23f8e812e332","Type":"ContainerStarted","Data":"29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70"} Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.502812 4771 generic.go:334] "Generic (PLEG): container finished" podID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" containerID="95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3" exitCode=0 Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.502960 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" event={"ID":"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1","Type":"ContainerDied","Data":"95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3"} Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.503014 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-cnbf7" event={"ID":"c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1","Type":"ContainerDied","Data":"cb5f7a8b9a9b4c57ba4ef9ddd84d972b443b4aa02c24892cd1c4552e67f6c66d"} Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.503098 4771 scope.go:117] "RemoveContainer" containerID="95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3" Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.503856 4771 util.go:48] "No ready sandbox for pod can be found. 
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.507592 4771 generic.go:334] "Generic (PLEG): container finished" podID="000564b2-d16b-45fb-ba91-e65b85bd7fb5" containerID="73365ef406c64fbc746647a54db0e38b95a379a3f3c9f67f67df5773cebcdfb0" exitCode=0
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.507703 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tjchc" event={"ID":"000564b2-d16b-45fb-ba91-e65b85bd7fb5","Type":"ContainerDied","Data":"73365ef406c64fbc746647a54db0e38b95a379a3f3c9f67f67df5773cebcdfb0"}
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.510114 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a2c84581-5806-46dd-b352-390ef2d9826c","Type":"ContainerStarted","Data":"21065341d65c55328d33fca19982cb91c451939d0b0dd32c90272cca9aecf888"}
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.540163 4771 scope.go:117] "RemoveContainer" containerID="a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943"
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.608397 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cnbf7"]
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.614785 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-cnbf7"]
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.630443 4771 scope.go:117] "RemoveContainer" containerID="95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3"
Feb 27 01:22:53 crc kubenswrapper[4771]: E0227 01:22:53.630829 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3\": container with ID starting with 95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3 not found: ID does not exist" containerID="95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3"
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.630906 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3"} err="failed to get container status \"95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3\": rpc error: code = NotFound desc = could not find container \"95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3\": container with ID starting with 95c1a2b562b22ad041275f4caf77d35b149797520c8f8fd92b5248eabb1747c3 not found: ID does not exist"
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.630932 4771 scope.go:117] "RemoveContainer" containerID="a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943"
Feb 27 01:22:53 crc kubenswrapper[4771]: E0227 01:22:53.631158 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943\": container with ID starting with a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943 not found: ID does not exist" containerID="a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943"
Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.631178 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943"} err="failed to get container status \"a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943\": rpc error: code = NotFound desc = could not find container \"a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943\": container with ID starting with a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943 not found: ID does not exist"
containerID={"Type":"cri-o","ID":"a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943"} err="failed to get container status \"a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943\": rpc error: code = NotFound desc = could not find container \"a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943\": container with ID starting with a69e76e733e036b97931e4445558078465728f0b271ce8e48e35248876d3b943 not found: ID does not exist" Feb 27 01:22:53 crc kubenswrapper[4771]: I0227 01:22:53.784300 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" path="/var/lib/kubelet/pods/c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1/volumes" Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.518611 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13396b98-6f5b-4800-854f-7b7d6af4cda4","Type":"ContainerStarted","Data":"03e5510ed728db6fcd7b0f351cf2182f3de9e9b2850015fee6260fffcfd0c28b"} Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.522324 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92wll" event={"ID":"bbec25a6-8536-4f09-af33-bd1a36b9e051","Type":"ContainerStarted","Data":"3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12"} Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.525660 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ed3a808f-7dba-4f32-a081-29eab07e84c0","Type":"ContainerStarted","Data":"b9d68a14240f0e75cc70ef391ff342c9c2589228767350bb0202020f6e5fb1ad"} Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.531963 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tjchc" event={"ID":"000564b2-d16b-45fb-ba91-e65b85bd7fb5","Type":"ContainerStarted","Data":"be7bb4c0ceda35f230c78b1061c21bb5759a0e3ed11adaa6745d6ba4aa0d0d07"} Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.532045 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tjchc" event={"ID":"000564b2-d16b-45fb-ba91-e65b85bd7fb5","Type":"ContainerStarted","Data":"abd6ce74d37e4b9a3c15382a90d2598b131f2b2f0c200b677b93b28c54d2b787"} Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.532491 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.565108 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.214393631 podStartE2EDuration="18.565081998s" podCreationTimestamp="2026-02-27 01:22:36 +0000 UTC" firstStartedPulling="2026-02-27 01:22:43.306013422 +0000 UTC m=+1076.243574720" lastFinishedPulling="2026-02-27 01:22:53.656701799 +0000 UTC m=+1086.594263087" observedRunningTime="2026-02-27 01:22:54.55009909 +0000 UTC m=+1087.487660378" watchObservedRunningTime="2026-02-27 01:22:54.565081998 +0000 UTC m=+1087.502643346" Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.619462 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.325015889 podStartE2EDuration="14.619439453s" podCreationTimestamp="2026-02-27 01:22:40 +0000 UTC" firstStartedPulling="2026-02-27 01:22:45.358752548 +0000 UTC m=+1078.296313836" lastFinishedPulling="2026-02-27 01:22:53.653176102 +0000 UTC m=+1086.590737400" observedRunningTime="2026-02-27 01:22:54.587898832 
Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.620990 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tjchc" podStartSLOduration=10.516375789 podStartE2EDuration="17.620978875s" podCreationTimestamp="2026-02-27 01:22:37 +0000 UTC" firstStartedPulling="2026-02-27 01:22:43.452689579 +0000 UTC m=+1076.390250867" lastFinishedPulling="2026-02-27 01:22:50.557292665 +0000 UTC m=+1083.494853953" observedRunningTime="2026-02-27 01:22:54.615168266 +0000 UTC m=+1087.552729564" watchObservedRunningTime="2026-02-27 01:22:54.620978875 +0000 UTC m=+1087.558540173"
Feb 27 01:22:54 crc kubenswrapper[4771]: I0227 01:22:54.634164 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-92wll" podStartSLOduration=10.496928132 podStartE2EDuration="19.634142755s" podCreationTimestamp="2026-02-27 01:22:35 +0000 UTC" firstStartedPulling="2026-02-27 01:22:44.518585521 +0000 UTC m=+1077.456146809" lastFinishedPulling="2026-02-27 01:22:53.655800144 +0000 UTC m=+1086.593361432" observedRunningTime="2026-02-27 01:22:54.633002024 +0000 UTC m=+1087.570563342" watchObservedRunningTime="2026-02-27 01:22:54.634142755 +0000 UTC m=+1087.571704053"
Feb 27 01:22:55 crc kubenswrapper[4771]: I0227 01:22:55.539585 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tjchc"
Feb 27 01:22:55 crc kubenswrapper[4771]: I0227 01:22:55.844064 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 27 01:22:55 crc kubenswrapper[4771]: I0227 01:22:55.873335 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-92wll"
Feb 27 01:22:55 crc kubenswrapper[4771]: I0227 01:22:55.873383 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-92wll"
Feb 27 01:22:55 crc kubenswrapper[4771]: I0227 01:22:55.896762 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 27 01:22:55 crc kubenswrapper[4771]: I0227 01:22:55.937955 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-92wll"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.556611 4771 generic.go:334] "Generic (PLEG): container finished" podID="39fb27d1-e9a6-44e4-9f92-d5f0242a8007" containerID="53a372b7f66b29cd8396bc554057c88b8f20095a4618cb80c6d418231d4cabdc" exitCode=0
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.556730 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"39fb27d1-e9a6-44e4-9f92-d5f0242a8007","Type":"ContainerDied","Data":"53a372b7f66b29cd8396bc554057c88b8f20095a4618cb80c6d418231d4cabdc"}
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.561859 4771 generic.go:334] "Generic (PLEG): container finished" podID="8be4acd2-0f92-4f9f-9521-5da586b712f0" containerID="6dc4045f9ac794b2f079255b482601733897511a7189e688aafa9b9365e2a7e7" exitCode=0
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.561969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8be4acd2-0f92-4f9f-9521-5da586b712f0","Type":"ContainerDied","Data":"6dc4045f9ac794b2f079255b482601733897511a7189e688aafa9b9365e2a7e7"}
event={"ID":"8be4acd2-0f92-4f9f-9521-5da586b712f0","Type":"ContainerDied","Data":"6dc4045f9ac794b2f079255b482601733897511a7189e688aafa9b9365e2a7e7"} Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.562384 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.562981 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.563010 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.623334 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.656259 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.808064 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qtmgl"] Feb 27 01:22:56 crc kubenswrapper[4771]: E0227 01:22:56.808396 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" containerName="init" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.808413 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" containerName="init" Feb 27 01:22:56 crc kubenswrapper[4771]: E0227 01:22:56.808440 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" containerName="dnsmasq-dns" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.808446 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" containerName="dnsmasq-dns" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.808637 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d32d09-97a5-4e81-adf1-4f7be9ad8fc1" containerName="dnsmasq-dns" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.810799 4771 util.go:30] "No sandbox for pod can be found. 
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.814270 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.821929 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qtmgl"]
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.880662 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj2mg\" (UniqueName: \"kubernetes.io/projected/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-kube-api-access-wj2mg\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.880883 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.880914 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.880951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-config\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.925889 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fcmgm"]
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.926879 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.932386 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.938658 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fcmgm"]
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.982189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ef0bfcb-87a8-4b1d-9084-3486da00981a-ovn-rundir\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.982245 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ef0bfcb-87a8-4b1d-9084-3486da00981a-config\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.982273 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl7k7\" (UniqueName: \"kubernetes.io/projected/3ef0bfcb-87a8-4b1d-9084-3486da00981a-kube-api-access-kl7k7\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.982424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef0bfcb-87a8-4b1d-9084-3486da00981a-combined-ca-bundle\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.983047 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj2mg\" (UniqueName: \"kubernetes.io/projected/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-kube-api-access-wj2mg\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.983149 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ef0bfcb-87a8-4b1d-9084-3486da00981a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.983199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ef0bfcb-87a8-4b1d-9084-3486da00981a-ovs-rundir\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.983249 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl"
\"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.983284 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.984248 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.984257 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.984746 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-config\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.986096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-config\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:22:56 crc kubenswrapper[4771]: I0227 01:22:56.998543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj2mg\" (UniqueName: \"kubernetes.io/projected/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-kube-api-access-wj2mg\") pod \"dnsmasq-dns-7fd796d7df-qtmgl\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.086532 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ef0bfcb-87a8-4b1d-9084-3486da00981a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.086599 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ef0bfcb-87a8-4b1d-9084-3486da00981a-ovs-rundir\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.086665 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ef0bfcb-87a8-4b1d-9084-3486da00981a-ovn-rundir\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm" Feb 
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.086709 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ef0bfcb-87a8-4b1d-9084-3486da00981a-config\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.086742 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl7k7\" (UniqueName: \"kubernetes.io/projected/3ef0bfcb-87a8-4b1d-9084-3486da00981a-kube-api-access-kl7k7\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.086790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef0bfcb-87a8-4b1d-9084-3486da00981a-combined-ca-bundle\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.087704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ef0bfcb-87a8-4b1d-9084-3486da00981a-ovn-rundir\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.087704 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ef0bfcb-87a8-4b1d-9084-3486da00981a-ovs-rundir\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.088508 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ef0bfcb-87a8-4b1d-9084-3486da00981a-config\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.091341 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef0bfcb-87a8-4b1d-9084-3486da00981a-combined-ca-bundle\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.091771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ef0bfcb-87a8-4b1d-9084-3486da00981a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.115124 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl7k7\" (UniqueName: \"kubernetes.io/projected/3ef0bfcb-87a8-4b1d-9084-3486da00981a-kube-api-access-kl7k7\") pod \"ovn-controller-metrics-fcmgm\" (UID: \"3ef0bfcb-87a8-4b1d-9084-3486da00981a\") " pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.133604 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.202047 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qtmgl"]
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.233268 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8dc9q"]
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.237796 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.244897 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fcmgm"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.245380 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.250117 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8dc9q"]
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.293003 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppgzq\" (UniqueName: \"kubernetes.io/projected/43291add-2cca-4fc7-9546-b1706148158a-kube-api-access-ppgzq\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.293131 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-config\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.293187 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.293209 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.293240 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.396314 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.396357 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.396388 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.396465 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppgzq\" (UniqueName: \"kubernetes.io/projected/43291add-2cca-4fc7-9546-b1706148158a-kube-api-access-ppgzq\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.396504 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-config\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.399092 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.399132 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-config\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.399096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.400599 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.415933 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppgzq\" (UniqueName: \"kubernetes.io/projected/43291add-2cca-4fc7-9546-b1706148158a-kube-api-access-ppgzq\") pod \"dnsmasq-dns-86db49b7ff-8dc9q\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q"
Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.485936 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
pod="openstack/memcached-0" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.571345 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.617888 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.619294 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qtmgl"] Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.748927 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fcmgm"] Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.871624 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.873288 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.878605 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.887117 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.887306 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-m7xlw" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.887412 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 27 01:22:57 crc kubenswrapper[4771]: I0227 01:22:57.887507 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.020908 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65f02053-1ff7-4e60-ae6e-e25c36df39da-config\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.021174 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmljl\" (UniqueName: \"kubernetes.io/projected/65f02053-1ff7-4e60-ae6e-e25c36df39da-kube-api-access-nmljl\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.021224 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65f02053-1ff7-4e60-ae6e-e25c36df39da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.021244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.021266 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/65f02053-1ff7-4e60-ae6e-e25c36df39da-scripts\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.021283 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.021323 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.041005 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8dc9q"] Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.123326 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65f02053-1ff7-4e60-ae6e-e25c36df39da-config\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.123369 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmljl\" (UniqueName: \"kubernetes.io/projected/65f02053-1ff7-4e60-ae6e-e25c36df39da-kube-api-access-nmljl\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.123443 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65f02053-1ff7-4e60-ae6e-e25c36df39da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.123483 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.123506 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f02053-1ff7-4e60-ae6e-e25c36df39da-scripts\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.123524 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.123610 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.126078 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65f02053-1ff7-4e60-ae6e-e25c36df39da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.126780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65f02053-1ff7-4e60-ae6e-e25c36df39da-config\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.128074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f02053-1ff7-4e60-ae6e-e25c36df39da-scripts\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.129051 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.129251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.129513 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f02053-1ff7-4e60-ae6e-e25c36df39da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.143093 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmljl\" (UniqueName: \"kubernetes.io/projected/65f02053-1ff7-4e60-ae6e-e25c36df39da-kube-api-access-nmljl\") pod \"ovn-northd-0\" (UID: \"65f02053-1ff7-4e60-ae6e-e25c36df39da\") " pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.255760 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.576709 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fcmgm" event={"ID":"3ef0bfcb-87a8-4b1d-9084-3486da00981a","Type":"ContainerStarted","Data":"59f2c7022a363fb996ffe3ebfad77e2e7b24d3705199d888d795905c92b7c33e"} Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.578088 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" event={"ID":"43291add-2cca-4fc7-9546-b1706148158a","Type":"ContainerStarted","Data":"991293fa3125b36790e0a82b7e93ff6d60a400f90bcc7bd58026132225d4f48a"} Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.579423 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" event={"ID":"0ce2c2f6-b248-4790-b76d-9eda92ccfe78","Type":"ContainerStarted","Data":"4d2b45a03a3cac99b6b638ff839c13de6d38090ae25fdc250d9b54ed048a245e"} Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.777398 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.953058 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:22:58 crc kubenswrapper[4771]: I0227 01:22:58.953143 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.588766 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"65f02053-1ff7-4e60-ae6e-e25c36df39da","Type":"ContainerStarted","Data":"572704fd529cafec4e33751bb7cb67c1aa854a5a081a9bba4d480ea242019bc0"} Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.591277 4771 generic.go:334] "Generic (PLEG): container finished" podID="0ce2c2f6-b248-4790-b76d-9eda92ccfe78" containerID="4aebfa6ffb32ec2745dab51a9a3ee831b46ce0c56d024cb00f61ed4e35f87da1" exitCode=0 Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.591324 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" event={"ID":"0ce2c2f6-b248-4790-b76d-9eda92ccfe78","Type":"ContainerDied","Data":"4aebfa6ffb32ec2745dab51a9a3ee831b46ce0c56d024cb00f61ed4e35f87da1"} Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.593533 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"39fb27d1-e9a6-44e4-9f92-d5f0242a8007","Type":"ContainerStarted","Data":"28792c1ce84782ff5e2d236b81b3c74fcf8a1f5e3192d054eef72bd5de2ead11"} Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.596130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fcmgm" event={"ID":"3ef0bfcb-87a8-4b1d-9084-3486da00981a","Type":"ContainerStarted","Data":"c7c1145ec391edab6ea5ec1f474edb5d3416ff4e1510e3712896debe30f8fff3"} Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.617303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" 
event={"ID":"43291add-2cca-4fc7-9546-b1706148158a","Type":"ContainerDied","Data":"bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de"} Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.618346 4771 generic.go:334] "Generic (PLEG): container finished" podID="43291add-2cca-4fc7-9546-b1706148158a" containerID="bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de" exitCode=0 Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.622747 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8be4acd2-0f92-4f9f-9521-5da586b712f0","Type":"ContainerStarted","Data":"5fb6ec12c26dff9d3225272e13abced2c0ca3b88b663b2f80c65eb99e8a7d16e"} Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.682764 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fcmgm" podStartSLOduration=3.682746238 podStartE2EDuration="3.682746238s" podCreationTimestamp="2026-02-27 01:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:22:59.670810521 +0000 UTC m=+1092.608371809" watchObservedRunningTime="2026-02-27 01:22:59.682746238 +0000 UTC m=+1092.620307526" Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.707765 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.135696307 podStartE2EDuration="29.707746s" podCreationTimestamp="2026-02-27 01:22:30 +0000 UTC" firstStartedPulling="2026-02-27 01:22:43.07488158 +0000 UTC m=+1076.012442878" lastFinishedPulling="2026-02-27 01:22:50.646931283 +0000 UTC m=+1083.584492571" observedRunningTime="2026-02-27 01:22:59.696536004 +0000 UTC m=+1092.634097292" watchObservedRunningTime="2026-02-27 01:22:59.707746 +0000 UTC m=+1092.645307288" Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.747972 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.287140342 podStartE2EDuration="30.747949568s" podCreationTimestamp="2026-02-27 01:22:29 +0000 UTC" firstStartedPulling="2026-02-27 01:22:43.206679909 +0000 UTC m=+1076.144241217" lastFinishedPulling="2026-02-27 01:22:50.667489155 +0000 UTC m=+1083.605050443" observedRunningTime="2026-02-27 01:22:59.736454644 +0000 UTC m=+1092.674015932" watchObservedRunningTime="2026-02-27 01:22:59.747949568 +0000 UTC m=+1092.685510856" Feb 27 01:22:59 crc kubenswrapper[4771]: I0227 01:22:59.965059 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.064236 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-config\") pod \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.064391 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-ovsdbserver-nb\") pod \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.064416 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj2mg\" (UniqueName: \"kubernetes.io/projected/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-kube-api-access-wj2mg\") pod \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.064445 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-dns-svc\") pod \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\" (UID: \"0ce2c2f6-b248-4790-b76d-9eda92ccfe78\") " Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.073288 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-kube-api-access-wj2mg" (OuterVolumeSpecName: "kube-api-access-wj2mg") pod "0ce2c2f6-b248-4790-b76d-9eda92ccfe78" (UID: "0ce2c2f6-b248-4790-b76d-9eda92ccfe78"). InnerVolumeSpecName "kube-api-access-wj2mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.112314 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ce2c2f6-b248-4790-b76d-9eda92ccfe78" (UID: "0ce2c2f6-b248-4790-b76d-9eda92ccfe78"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.122928 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-config" (OuterVolumeSpecName: "config") pod "0ce2c2f6-b248-4790-b76d-9eda92ccfe78" (UID: "0ce2c2f6-b248-4790-b76d-9eda92ccfe78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.131978 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ce2c2f6-b248-4790-b76d-9eda92ccfe78" (UID: "0ce2c2f6-b248-4790-b76d-9eda92ccfe78"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.166806 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.166838 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.166848 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.166859 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj2mg\" (UniqueName: \"kubernetes.io/projected/0ce2c2f6-b248-4790-b76d-9eda92ccfe78-kube-api-access-wj2mg\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.631338 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" event={"ID":"0ce2c2f6-b248-4790-b76d-9eda92ccfe78","Type":"ContainerDied","Data":"4d2b45a03a3cac99b6b638ff839c13de6d38090ae25fdc250d9b54ed048a245e"} Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.631369 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-qtmgl" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.631659 4771 scope.go:117] "RemoveContainer" containerID="4aebfa6ffb32ec2745dab51a9a3ee831b46ce0c56d024cb00f61ed4e35f87da1" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.636119 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" event={"ID":"43291add-2cca-4fc7-9546-b1706148158a","Type":"ContainerStarted","Data":"4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c"} Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.636161 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.668681 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" podStartSLOduration=3.668663316 podStartE2EDuration="3.668663316s" podCreationTimestamp="2026-02-27 01:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:00.667458573 +0000 UTC m=+1093.605019871" watchObservedRunningTime="2026-02-27 01:23:00.668663316 +0000 UTC m=+1093.606224604" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.816015 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qtmgl"] Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.827229 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-qtmgl"] Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.907661 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 27 01:23:00 crc kubenswrapper[4771]: I0227 01:23:00.907711 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 27 01:23:01 crc 
kubenswrapper[4771]: I0227 01:23:01.643907 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"65f02053-1ff7-4e60-ae6e-e25c36df39da","Type":"ContainerStarted","Data":"ea8a04e4945c437857799b6d3856a0268fbb51f9c5b260c95c097a3f18b07665"} Feb 27 01:23:01 crc kubenswrapper[4771]: I0227 01:23:01.644191 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"65f02053-1ff7-4e60-ae6e-e25c36df39da","Type":"ContainerStarted","Data":"b43c6f594e06db0fb0a3d3520119f1f5b69945ab734f6de727264d2b73989c49"} Feb 27 01:23:01 crc kubenswrapper[4771]: I0227 01:23:01.645196 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 27 01:23:01 crc kubenswrapper[4771]: I0227 01:23:01.681517 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.9769763129999998 podStartE2EDuration="4.681494149s" podCreationTimestamp="2026-02-27 01:22:57 +0000 UTC" firstStartedPulling="2026-02-27 01:22:58.779517877 +0000 UTC m=+1091.717079175" lastFinishedPulling="2026-02-27 01:23:00.484035703 +0000 UTC m=+1093.421597011" observedRunningTime="2026-02-27 01:23:01.668919865 +0000 UTC m=+1094.606481163" watchObservedRunningTime="2026-02-27 01:23:01.681494149 +0000 UTC m=+1094.619055447" Feb 27 01:23:01 crc kubenswrapper[4771]: I0227 01:23:01.784293 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce2c2f6-b248-4790-b76d-9eda92ccfe78" path="/var/lib/kubelet/pods/0ce2c2f6-b248-4790-b76d-9eda92ccfe78/volumes" Feb 27 01:23:02 crc kubenswrapper[4771]: I0227 01:23:02.318469 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 27 01:23:02 crc kubenswrapper[4771]: I0227 01:23:02.319104 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.419100 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.490387 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.815532 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8b3d-account-create-update-phxws"] Feb 27 01:23:03 crc kubenswrapper[4771]: E0227 01:23:03.816108 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce2c2f6-b248-4790-b76d-9eda92ccfe78" containerName="init" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.816187 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce2c2f6-b248-4790-b76d-9eda92ccfe78" containerName="init" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.816393 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce2c2f6-b248-4790-b76d-9eda92ccfe78" containerName="init" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.817022 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.818722 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.828773 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8b3d-account-create-update-phxws"] Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.870074 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ggggn"] Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.880822 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ggggn" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.894359 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ggggn"] Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.927883 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccc234ed-1171-4186-ad79-6029cc652fda-operator-scripts\") pod \"placement-db-create-ggggn\" (UID: \"ccc234ed-1171-4186-ad79-6029cc652fda\") " pod="openstack/placement-db-create-ggggn" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.927928 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzdw\" (UniqueName: \"kubernetes.io/projected/ccc234ed-1171-4186-ad79-6029cc652fda-kube-api-access-wgzdw\") pod \"placement-db-create-ggggn\" (UID: \"ccc234ed-1171-4186-ad79-6029cc652fda\") " pod="openstack/placement-db-create-ggggn" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.927955 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-operator-scripts\") pod \"placement-8b3d-account-create-update-phxws\" (UID: \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\") " pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:03 crc kubenswrapper[4771]: I0227 01:23:03.928042 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt498\" (UniqueName: \"kubernetes.io/projected/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-kube-api-access-wt498\") pod \"placement-8b3d-account-create-update-phxws\" (UID: \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\") " pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.029680 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt498\" (UniqueName: \"kubernetes.io/projected/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-kube-api-access-wt498\") pod \"placement-8b3d-account-create-update-phxws\" (UID: \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\") " pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.029826 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccc234ed-1171-4186-ad79-6029cc652fda-operator-scripts\") pod \"placement-db-create-ggggn\" (UID: \"ccc234ed-1171-4186-ad79-6029cc652fda\") " pod="openstack/placement-db-create-ggggn" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.029861 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wgzdw\" (UniqueName: \"kubernetes.io/projected/ccc234ed-1171-4186-ad79-6029cc652fda-kube-api-access-wgzdw\") pod \"placement-db-create-ggggn\" (UID: \"ccc234ed-1171-4186-ad79-6029cc652fda\") " pod="openstack/placement-db-create-ggggn" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.029905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-operator-scripts\") pod \"placement-8b3d-account-create-update-phxws\" (UID: \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\") " pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.030854 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccc234ed-1171-4186-ad79-6029cc652fda-operator-scripts\") pod \"placement-db-create-ggggn\" (UID: \"ccc234ed-1171-4186-ad79-6029cc652fda\") " pod="openstack/placement-db-create-ggggn" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.031047 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-operator-scripts\") pod \"placement-8b3d-account-create-update-phxws\" (UID: \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\") " pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.051949 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzdw\" (UniqueName: \"kubernetes.io/projected/ccc234ed-1171-4186-ad79-6029cc652fda-kube-api-access-wgzdw\") pod \"placement-db-create-ggggn\" (UID: \"ccc234ed-1171-4186-ad79-6029cc652fda\") " pod="openstack/placement-db-create-ggggn" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.052122 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt498\" (UniqueName: \"kubernetes.io/projected/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-kube-api-access-wt498\") pod \"placement-8b3d-account-create-update-phxws\" (UID: \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\") " pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.142399 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.198076 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ggggn" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.689095 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.693742 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8b3d-account-create-update-phxws"] Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.727653 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8dc9q"] Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.727906 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" podUID="43291add-2cca-4fc7-9546-b1706148158a" containerName="dnsmasq-dns" containerID="cri-o://4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c" gracePeriod=10 Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.730857 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.750442 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-w4hf9"] Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.752121 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.787439 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-w4hf9"] Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.820245 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ggggn"] Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.847519 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-config\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.847574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-dns-svc\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.847609 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.847625 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpdsz\" (UniqueName: \"kubernetes.io/projected/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-kube-api-access-xpdsz\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.847962 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.949477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.949561 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-config\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.949585 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-dns-svc\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.949631 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.949648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpdsz\" (UniqueName: \"kubernetes.io/projected/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-kube-api-access-xpdsz\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.950880 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.950896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-config\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.951321 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-dns-svc\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.951326 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-sb\") 
pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:04 crc kubenswrapper[4771]: I0227 01:23:04.972426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpdsz\" (UniqueName: \"kubernetes.io/projected/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-kube-api-access-xpdsz\") pod \"dnsmasq-dns-698758b865-w4hf9\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.098214 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.297317 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.356306 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-nb\") pod \"43291add-2cca-4fc7-9546-b1706148158a\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.356440 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-config\") pod \"43291add-2cca-4fc7-9546-b1706148158a\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.356487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppgzq\" (UniqueName: \"kubernetes.io/projected/43291add-2cca-4fc7-9546-b1706148158a-kube-api-access-ppgzq\") pod \"43291add-2cca-4fc7-9546-b1706148158a\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.356569 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-sb\") pod \"43291add-2cca-4fc7-9546-b1706148158a\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.356619 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-dns-svc\") pod \"43291add-2cca-4fc7-9546-b1706148158a\" (UID: \"43291add-2cca-4fc7-9546-b1706148158a\") " Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.377700 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43291add-2cca-4fc7-9546-b1706148158a-kube-api-access-ppgzq" (OuterVolumeSpecName: "kube-api-access-ppgzq") pod "43291add-2cca-4fc7-9546-b1706148158a" (UID: "43291add-2cca-4fc7-9546-b1706148158a"). InnerVolumeSpecName "kube-api-access-ppgzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.390830 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43291add-2cca-4fc7-9546-b1706148158a" (UID: "43291add-2cca-4fc7-9546-b1706148158a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.398373 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.404246 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43291add-2cca-4fc7-9546-b1706148158a" (UID: "43291add-2cca-4fc7-9546-b1706148158a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.405866 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-config" (OuterVolumeSpecName: "config") pod "43291add-2cca-4fc7-9546-b1706148158a" (UID: "43291add-2cca-4fc7-9546-b1706148158a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.409148 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43291add-2cca-4fc7-9546-b1706148158a" (UID: "43291add-2cca-4fc7-9546-b1706148158a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.458413 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppgzq\" (UniqueName: \"kubernetes.io/projected/43291add-2cca-4fc7-9546-b1706148158a-kube-api-access-ppgzq\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.458446 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.458461 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.458473 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.458487 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43291add-2cca-4fc7-9546-b1706148158a-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.482525 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.553073 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-w4hf9"] Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.675579 4771 generic.go:334] "Generic (PLEG): container finished" podID="43291add-2cca-4fc7-9546-b1706148158a" containerID="4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c" exitCode=0 Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.675744 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" event={"ID":"43291add-2cca-4fc7-9546-b1706148158a","Type":"ContainerDied","Data":"4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c"} Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.676181 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" event={"ID":"43291add-2cca-4fc7-9546-b1706148158a","Type":"ContainerDied","Data":"991293fa3125b36790e0a82b7e93ff6d60a400f90bcc7bd58026132225d4f48a"} Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.676259 4771 scope.go:117] "RemoveContainer" containerID="4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.675833 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8dc9q" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.714089 4771 generic.go:334] "Generic (PLEG): container finished" podID="ccc234ed-1171-4186-ad79-6029cc652fda" containerID="71760be4e11a514bdbdf989a6e3e8a8920da84ab5247daa34dcb145e0089e4d1" exitCode=0 Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.714170 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ggggn" event={"ID":"ccc234ed-1171-4186-ad79-6029cc652fda","Type":"ContainerDied","Data":"71760be4e11a514bdbdf989a6e3e8a8920da84ab5247daa34dcb145e0089e4d1"} Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.714201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ggggn" event={"ID":"ccc234ed-1171-4186-ad79-6029cc652fda","Type":"ContainerStarted","Data":"97346c062d08424b6f6a75c1fbd6c5062c282e6bece5b250a8151c62a5c1fd2a"} Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.719377 4771 generic.go:334] "Generic (PLEG): container finished" podID="b31ac243-4c5d-4cf2-9be0-a2adbcf42186" containerID="b24e6c7bf29978fa688d9704ce181fb0d1d5b851432febc23b8945cbf26408f0" exitCode=0 Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.719704 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b3d-account-create-update-phxws" event={"ID":"b31ac243-4c5d-4cf2-9be0-a2adbcf42186","Type":"ContainerDied","Data":"b24e6c7bf29978fa688d9704ce181fb0d1d5b851432febc23b8945cbf26408f0"} Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.719743 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b3d-account-create-update-phxws" event={"ID":"b31ac243-4c5d-4cf2-9be0-a2adbcf42186","Type":"ContainerStarted","Data":"6c52f811f58201acc80d5c2440f2a0f8fd6ad4db93002202e82c1fc5dbe24c74"} Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.734188 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-w4hf9" event={"ID":"2d72b00b-f919-47dd-8f0b-428a4d08d0e8","Type":"ContainerStarted","Data":"674ebc610c82d34ca1eaa4f8ba55e87fe412d6309e5024658e55439acfac4e44"} Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.741775 4771 scope.go:117] "RemoveContainer" containerID="bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.765673 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8dc9q"] Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.769920 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8dc9q"] Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.805860 4771 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43291add-2cca-4fc7-9546-b1706148158a" path="/var/lib/kubelet/pods/43291add-2cca-4fc7-9546-b1706148158a/volumes" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.832878 4771 scope.go:117] "RemoveContainer" containerID="4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c" Feb 27 01:23:05 crc kubenswrapper[4771]: E0227 01:23:05.843087 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c\": container with ID starting with 4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c not found: ID does not exist" containerID="4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.843329 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c"} err="failed to get container status \"4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c\": rpc error: code = NotFound desc = could not find container \"4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c\": container with ID starting with 4b77ae0e92f64e3b0e6e4ba5c54ab302c61b3fabeb97455162d4b1115d7da23c not found: ID does not exist" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.843423 4771 scope.go:117] "RemoveContainer" containerID="bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de" Feb 27 01:23:05 crc kubenswrapper[4771]: E0227 01:23:05.846677 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de\": container with ID starting with bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de not found: ID does not exist" containerID="bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.846720 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de"} err="failed to get container status \"bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de\": rpc error: code = NotFound desc = could not find container \"bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de\": container with ID starting with bcbf06f6cdc97c8e2429006ebdd2c28e4d35d3b62059a57de6e1b4d43eb924de not found: ID does not exist" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.854691 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 27 01:23:05 crc kubenswrapper[4771]: E0227 01:23:05.855700 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43291add-2cca-4fc7-9546-b1706148158a" containerName="dnsmasq-dns" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.855799 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="43291add-2cca-4fc7-9546-b1706148158a" containerName="dnsmasq-dns" Feb 27 01:23:05 crc kubenswrapper[4771]: E0227 01:23:05.855879 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43291add-2cca-4fc7-9546-b1706148158a" containerName="init" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.855932 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="43291add-2cca-4fc7-9546-b1706148158a" 
containerName="init" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.856135 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="43291add-2cca-4fc7-9546-b1706148158a" containerName="dnsmasq-dns" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.923574 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.926954 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.936593 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lxglz" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.936710 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.936611 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.936681 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 27 01:23:05 crc kubenswrapper[4771]: I0227 01:23:05.973524 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.023976 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/251e5c6f-c762-4a6e-9253-81f94d592239-lock\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.024108 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/251e5c6f-c762-4a6e-9253-81f94d592239-cache\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.024176 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.024195 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/251e5c6f-c762-4a6e-9253-81f94d592239-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.024221 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.024245 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl69z\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-kube-api-access-vl69z\") pod \"swift-storage-0\" 
(UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.028701 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-92wll"] Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.126372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/251e5c6f-c762-4a6e-9253-81f94d592239-lock\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.126476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/251e5c6f-c762-4a6e-9253-81f94d592239-cache\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.126642 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.126702 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/251e5c6f-c762-4a6e-9253-81f94d592239-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.126752 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.126795 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl69z\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-kube-api-access-vl69z\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.127484 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/251e5c6f-c762-4a6e-9253-81f94d592239-cache\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.127514 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.127987 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/251e5c6f-c762-4a6e-9253-81f94d592239-lock\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: E0227 01:23:06.127649 4771 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 01:23:06 crc kubenswrapper[4771]: E0227 01:23:06.128034 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 01:23:06 crc kubenswrapper[4771]: E0227 01:23:06.128083 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift podName:251e5c6f-c762-4a6e-9253-81f94d592239 nodeName:}" failed. No retries permitted until 2026-02-27 01:23:06.628063957 +0000 UTC m=+1099.565625245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift") pod "swift-storage-0" (UID: "251e5c6f-c762-4a6e-9253-81f94d592239") : configmap "swift-ring-files" not found Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.136223 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/251e5c6f-c762-4a6e-9253-81f94d592239-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.151837 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl69z\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-kube-api-access-vl69z\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.152210 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.307975 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-cm796"] Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.309180 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.312828 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.312862 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.314176 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.318586 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cm796"] Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.431096 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-combined-ca-bundle\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.431173 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-dispersionconf\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.431210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-ring-data-devices\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.431251 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-scripts\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.431285 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a59a151-f189-4128-b462-29557b12a8da-etc-swift\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.431322 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-swiftconf\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.431369 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj74q\" (UniqueName: \"kubernetes.io/projected/8a59a151-f189-4128-b462-29557b12a8da-kube-api-access-kj74q\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 
01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.532773 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-combined-ca-bundle\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.532853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-dispersionconf\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.532879 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-ring-data-devices\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.532914 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-scripts\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.532944 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a59a151-f189-4128-b462-29557b12a8da-etc-swift\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.532959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-swiftconf\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.532991 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj74q\" (UniqueName: \"kubernetes.io/projected/8a59a151-f189-4128-b462-29557b12a8da-kube-api-access-kj74q\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.538146 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-combined-ca-bundle\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.538299 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a59a151-f189-4128-b462-29557b12a8da-etc-swift\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796"
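These paired entries come from the kubelet volume manager's reconcile loop: reconciler_common diffs the desired state of world (the volumes the pod specs require) against the actual state (what is currently mounted), starts a MountVolume operation for each missing volume, and operation_generator logs a SetUp result as each one completes. A toy Go sketch of that control loop, with invented types and none of the real plugin machinery:

    package main

    import "fmt"

    // reconcile is a toy version of the desired-state/actual-state loop
    // that produces the "MountVolume started" / "SetUp succeeded" pairs
    // above: every desired volume not yet mounted gets a mount operation.
    func reconcile(desired, mounted map[string]bool) {
    	for vol := range desired {
    		if mounted[vol] {
    			continue // already in the actual state of world
    		}
    		fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
    		// ... the volume plugin's SetUp (configmap/secret/projected) runs here ...
    		mounted[vol] = true
    		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
    	}
    }

    func main() {
    	desired := map[string]bool{"scripts": true, "swiftconf": true, "etc-swift": true}
    	mounted := map[string]bool{}
    	reconcile(desired, mounted) // mounts everything that is missing
    }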
Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.538319 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-scripts\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.539064 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-ring-data-devices\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.543970 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-swiftconf\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.546719 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-dispersionconf\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.549211 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj74q\" (UniqueName: \"kubernetes.io/projected/8a59a151-f189-4128-b462-29557b12a8da-kube-api-access-kj74q\") pod \"swift-ring-rebalance-cm796\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.624577 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.635242 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:06 crc kubenswrapper[4771]: E0227 01:23:06.635419 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 01:23:06 crc kubenswrapper[4771]: E0227 01:23:06.635444 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 01:23:06 crc kubenswrapper[4771]: E0227 01:23:06.635493 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift podName:251e5c6f-c762-4a6e-9253-81f94d592239 nodeName:}" failed. No retries permitted until 2026-02-27 01:23:07.635476355 +0000 UTC m=+1100.573037663 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift") pod "swift-storage-0" (UID: "251e5c6f-c762-4a6e-9253-81f94d592239") : configmap "swift-ring-files" not found
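The etc-swift failure above is a dependency problem rather than a mount bug: the projected volume for swift-storage-0 sources the swift-ring-files ConfigMap, which does not exist yet; presumably it is published once the swift-ring-rebalance-cm796 job starting above has run. Until then the kubelet retries the mount with a doubling backoff: durationBeforeRetry is 1s here and grows to 2s and then 4s in the retries later in this log. A minimal Go sketch of that doubling, capped backoff, an illustration rather than the kubelet's actual nestedpendingoperations code:

    package main

    import (
    	"fmt"
    	"time"
    )

    // expBackoff doubles the wait after every failed attempt, capped at max,
    // mirroring the 1s -> 2s -> 4s durationBeforeRetry progression logged here.
    type expBackoff struct {
    	initial, max, current time.Duration
    }

    func (b *expBackoff) next() time.Duration {
    	if b.current == 0 {
    		b.current = b.initial
    	} else {
    		b.current *= 2
    		if b.current > b.max {
    			b.current = b.max
    		}
    	}
    	return b.current
    }

    func main() {
    	b := &expBackoff{initial: time.Second, max: 2 * time.Minute}
    	for i := 0; i < 4; i++ {
    		fmt.Printf("durationBeforeRetry %v\n", b.next()) // 1s, 2s, 4s, 8s
    	}
    }

Once the ConfigMap is published, the next scheduled retry should succeed and swift-storage-0 can start.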
Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.789512 4771 generic.go:334] "Generic (PLEG): container finished" podID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" containerID="54112211ee0425cdc9f26c6c5c1245475e2e0ee45d665733fc638c86f483bb2a" exitCode=0 Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.789587 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-w4hf9" event={"ID":"2d72b00b-f919-47dd-8f0b-428a4d08d0e8","Type":"ContainerDied","Data":"54112211ee0425cdc9f26c6c5c1245475e2e0ee45d665733fc638c86f483bb2a"} Feb 27 01:23:06 crc kubenswrapper[4771]: I0227 01:23:06.790142 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-92wll" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerName="registry-server" containerID="cri-o://3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12" gracePeriod=2 Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.132601 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cm796"] Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.322994 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ggggn" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.329078 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.332760 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.458936 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx6hk\" (UniqueName: \"kubernetes.io/projected/bbec25a6-8536-4f09-af33-bd1a36b9e051-kube-api-access-rx6hk\") pod \"bbec25a6-8536-4f09-af33-bd1a36b9e051\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.459845 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt498\" (UniqueName: \"kubernetes.io/projected/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-kube-api-access-wt498\") pod \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\" (UID: \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\") " Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.459901 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgzdw\" (UniqueName: \"kubernetes.io/projected/ccc234ed-1171-4186-ad79-6029cc652fda-kube-api-access-wgzdw\") pod \"ccc234ed-1171-4186-ad79-6029cc652fda\" (UID: \"ccc234ed-1171-4186-ad79-6029cc652fda\") " Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.459940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-catalog-content\") pod \"bbec25a6-8536-4f09-af33-bd1a36b9e051\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.460088 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccc234ed-1171-4186-ad79-6029cc652fda-operator-scripts\") pod \"ccc234ed-1171-4186-ad79-6029cc652fda\" (UID: \"ccc234ed-1171-4186-ad79-6029cc652fda\") " Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.460135 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-utilities\") pod \"bbec25a6-8536-4f09-af33-bd1a36b9e051\" (UID: \"bbec25a6-8536-4f09-af33-bd1a36b9e051\") " Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.460166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-operator-scripts\") pod \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\" (UID: \"b31ac243-4c5d-4cf2-9be0-a2adbcf42186\") " Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.461061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b31ac243-4c5d-4cf2-9be0-a2adbcf42186" (UID: "b31ac243-4c5d-4cf2-9be0-a2adbcf42186"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.461216 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-utilities" (OuterVolumeSpecName: "utilities") pod "bbec25a6-8536-4f09-af33-bd1a36b9e051" (UID: "bbec25a6-8536-4f09-af33-bd1a36b9e051"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.461482 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc234ed-1171-4186-ad79-6029cc652fda-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ccc234ed-1171-4186-ad79-6029cc652fda" (UID: "ccc234ed-1171-4186-ad79-6029cc652fda"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.464437 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-kube-api-access-wt498" (OuterVolumeSpecName: "kube-api-access-wt498") pod "b31ac243-4c5d-4cf2-9be0-a2adbcf42186" (UID: "b31ac243-4c5d-4cf2-9be0-a2adbcf42186"). InnerVolumeSpecName "kube-api-access-wt498". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.466832 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbec25a6-8536-4f09-af33-bd1a36b9e051-kube-api-access-rx6hk" (OuterVolumeSpecName: "kube-api-access-rx6hk") pod "bbec25a6-8536-4f09-af33-bd1a36b9e051" (UID: "bbec25a6-8536-4f09-af33-bd1a36b9e051"). InnerVolumeSpecName "kube-api-access-rx6hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.466875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc234ed-1171-4186-ad79-6029cc652fda-kube-api-access-wgzdw" (OuterVolumeSpecName: "kube-api-access-wgzdw") pod "ccc234ed-1171-4186-ad79-6029cc652fda" (UID: "ccc234ed-1171-4186-ad79-6029cc652fda"). InnerVolumeSpecName "kube-api-access-wgzdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.492137 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbec25a6-8536-4f09-af33-bd1a36b9e051" (UID: "bbec25a6-8536-4f09-af33-bd1a36b9e051"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.562348 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt498\" (UniqueName: \"kubernetes.io/projected/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-kube-api-access-wt498\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.562385 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgzdw\" (UniqueName: \"kubernetes.io/projected/ccc234ed-1171-4186-ad79-6029cc652fda-kube-api-access-wgzdw\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.562400 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.562414 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccc234ed-1171-4186-ad79-6029cc652fda-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.562425 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbec25a6-8536-4f09-af33-bd1a36b9e051-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.562440 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31ac243-4c5d-4cf2-9be0-a2adbcf42186-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.562452 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx6hk\" (UniqueName: \"kubernetes.io/projected/bbec25a6-8536-4f09-af33-bd1a36b9e051-kube-api-access-rx6hk\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.609492 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-c5xcw"] Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.609882 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerName="extract-content" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.609904 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerName="extract-content" Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.609929 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc234ed-1171-4186-ad79-6029cc652fda" containerName="mariadb-database-create" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.609938 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc234ed-1171-4186-ad79-6029cc652fda" containerName="mariadb-database-create" Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.609956 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31ac243-4c5d-4cf2-9be0-a2adbcf42186" containerName="mariadb-account-create-update" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.609963 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31ac243-4c5d-4cf2-9be0-a2adbcf42186" containerName="mariadb-account-create-update" Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.609986 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" 
containerName="registry-server" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.609994 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerName="registry-server" Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.610004 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerName="extract-utilities" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.610011 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerName="extract-utilities" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.610206 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc234ed-1171-4186-ad79-6029cc652fda" containerName="mariadb-database-create" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.610234 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31ac243-4c5d-4cf2-9be0-a2adbcf42186" containerName="mariadb-account-create-update" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.610245 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerName="registry-server" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.611102 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.625011 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-c5xcw"] Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.663862 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.664109 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.664149 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.664213 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift podName:251e5c6f-c762-4a6e-9253-81f94d592239 nodeName:}" failed. No retries permitted until 2026-02-27 01:23:09.664192323 +0000 UTC m=+1102.601753621 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift") pod "swift-storage-0" (UID: "251e5c6f-c762-4a6e-9253-81f94d592239") : configmap "swift-ring-files" not found Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.701584 4771 scope.go:117] "RemoveContainer" containerID="f8477f3875ca09bdc0753dab90f4d9838358f9c121298b5d73e1a1a66cf13a2d" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.732667 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d365-account-create-update-hxfxb"] Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.733585 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.735753 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.763418 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d365-account-create-update-hxfxb"] Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.767657 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gjsr\" (UniqueName: \"kubernetes.io/projected/ae89ba17-392c-48f6-b05f-5217350743fe-kube-api-access-5gjsr\") pod \"glance-db-create-c5xcw\" (UID: \"ae89ba17-392c-48f6-b05f-5217350743fe\") " pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.767736 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae89ba17-392c-48f6-b05f-5217350743fe-operator-scripts\") pod \"glance-db-create-c5xcw\" (UID: \"ae89ba17-392c-48f6-b05f-5217350743fe\") " pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.841684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-w4hf9" event={"ID":"2d72b00b-f919-47dd-8f0b-428a4d08d0e8","Type":"ContainerStarted","Data":"9eac1ca5ec14913833db7f0abc8b1c19460bb4ed27ce8a4af8bb66832562b005"} Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.841905 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.846666 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbec25a6-8536-4f09-af33-bd1a36b9e051" containerID="3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12" exitCode=0 Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.846721 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92wll" event={"ID":"bbec25a6-8536-4f09-af33-bd1a36b9e051","Type":"ContainerDied","Data":"3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12"} Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.846746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92wll" event={"ID":"bbec25a6-8536-4f09-af33-bd1a36b9e051","Type":"ContainerDied","Data":"6c342f683c424c08563c348ad7c6b65e607176eb21825c0b47e11ebad3e33ebe"} Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.846762 4771 scope.go:117] "RemoveContainer" containerID="3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.846875 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92wll" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.851019 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ggggn" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.851069 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ggggn" event={"ID":"ccc234ed-1171-4186-ad79-6029cc652fda","Type":"ContainerDied","Data":"97346c062d08424b6f6a75c1fbd6c5062c282e6bece5b250a8151c62a5c1fd2a"} Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.851112 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97346c062d08424b6f6a75c1fbd6c5062c282e6bece5b250a8151c62a5c1fd2a" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.853430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b3d-account-create-update-phxws" event={"ID":"b31ac243-4c5d-4cf2-9be0-a2adbcf42186","Type":"ContainerDied","Data":"6c52f811f58201acc80d5c2440f2a0f8fd6ad4db93002202e82c1fc5dbe24c74"} Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.853455 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c52f811f58201acc80d5c2440f2a0f8fd6ad4db93002202e82c1fc5dbe24c74" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.853493 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8b3d-account-create-update-phxws" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.856733 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cm796" event={"ID":"8a59a151-f189-4128-b462-29557b12a8da","Type":"ContainerStarted","Data":"cd9cf29f0bd64d79d0f1b30e92787f76dbf458be22b2631f900807cee18e818f"} Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.866604 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-w4hf9" podStartSLOduration=3.866581051 podStartE2EDuration="3.866581051s" podCreationTimestamp="2026-02-27 01:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:07.857602425 +0000 UTC m=+1100.795163713" watchObservedRunningTime="2026-02-27 01:23:07.866581051 +0000 UTC m=+1100.804142359" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.869044 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae89ba17-392c-48f6-b05f-5217350743fe-operator-scripts\") pod \"glance-db-create-c5xcw\" (UID: \"ae89ba17-392c-48f6-b05f-5217350743fe\") " pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.869794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gjsr\" (UniqueName: \"kubernetes.io/projected/ae89ba17-392c-48f6-b05f-5217350743fe-kube-api-access-5gjsr\") pod \"glance-db-create-c5xcw\" (UID: \"ae89ba17-392c-48f6-b05f-5217350743fe\") " pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.869839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-operator-scripts\") pod \"glance-d365-account-create-update-hxfxb\" (UID: \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\") " pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.869865 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9r5\" (UniqueName: \"kubernetes.io/projected/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-kube-api-access-7b9r5\") pod \"glance-d365-account-create-update-hxfxb\" (UID: \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\") " pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.870096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae89ba17-392c-48f6-b05f-5217350743fe-operator-scripts\") pod \"glance-db-create-c5xcw\" (UID: \"ae89ba17-392c-48f6-b05f-5217350743fe\") " pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.884731 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-92wll"] Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.888455 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gjsr\" (UniqueName: \"kubernetes.io/projected/ae89ba17-392c-48f6-b05f-5217350743fe-kube-api-access-5gjsr\") pod \"glance-db-create-c5xcw\" (UID: \"ae89ba17-392c-48f6-b05f-5217350743fe\") " pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.888781 4771 scope.go:117] "RemoveContainer" containerID="cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.892108 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-92wll"] Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.908851 4771 scope.go:117] "RemoveContainer" containerID="d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.926612 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.932246 4771 scope.go:117] "RemoveContainer" containerID="3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12" Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.932827 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12\": container with ID starting with 3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12 not found: ID does not exist" containerID="3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.932878 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12"} err="failed to get container status \"3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12\": rpc error: code = NotFound desc = could not find container \"3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12\": container with ID starting with 3032c5c43bde1114097f9c6dab5432057c8f29eb9bb89d07830e03212605ed12 not found: ID does not exist" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.932909 4771 scope.go:117] "RemoveContainer" containerID="cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a" Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.933390 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a\": container with ID starting with cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a not found: ID does not exist" containerID="cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.933419 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a"} err="failed to get container status \"cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a\": rpc error: code = NotFound desc = could not find container \"cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a\": container with ID starting with cba0ad000fd21c0eb073d5bad8b06a4c81fdbcb1e87462eaa0140eb85c17715a not found: ID does not exist" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.933440 4771 scope.go:117] "RemoveContainer" containerID="d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc" Feb 27 01:23:07 crc kubenswrapper[4771]: E0227 01:23:07.934195 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc\": container with ID starting with d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc not found: ID does not exist" containerID="d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.934223 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc"} err="failed to get container status \"d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc\": rpc error: code = NotFound desc = 
could not find container \"d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc\": container with ID starting with d22c8c007adffb7c68f0acf215a1cc79ace8ae6dd8f5e5c52ca0a1ca6ef4a2dc not found: ID does not exist" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.971656 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-operator-scripts\") pod \"glance-d365-account-create-update-hxfxb\" (UID: \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\") " pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.972285 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-operator-scripts\") pod \"glance-d365-account-create-update-hxfxb\" (UID: \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\") " pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.972381 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9r5\" (UniqueName: \"kubernetes.io/projected/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-kube-api-access-7b9r5\") pod \"glance-d365-account-create-update-hxfxb\" (UID: \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\") " pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:07 crc kubenswrapper[4771]: I0227 01:23:07.992898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9r5\" (UniqueName: \"kubernetes.io/projected/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-kube-api-access-7b9r5\") pod \"glance-d365-account-create-update-hxfxb\" (UID: \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\") " pod="openstack/glance-d365-account-create-update-hxfxb"
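The NotFound errors just above are the benign tail of pod cleanup: the redhat-marketplace-92wll containers were already removed, so the follow-up ContainerStatus lookups fail, and the kubelet logs "DeleteContainer returned error" but carries on. A small Go sketch of that idempotent-removal pattern, with a plain map standing in for the container runtime (hypothetical names, not the CRI client API):

    package main

    import (
    	"errors"
    	"fmt"
    )

    var errNotFound = errors.New("container not found: ID does not exist")

    // removeContainer tolerates containers that are already gone, so a
    // second cleanup pass (as in the log above) logs the error and moves on.
    func removeContainer(runtime map[string]bool, id string) error {
    	if !runtime[id] {
    		fmt.Printf("DeleteContainer returned error for %s: %v (ignored)\n", id, errNotFound)
    		return nil // already deleted: treat as success
    	}
    	delete(runtime, id)
    	return nil
    }

    func main() {
    	rt := map[string]bool{"3032c5c4": true}
    	_ = removeContainer(rt, "3032c5c4") // removed now
    	_ = removeContainer(rt, "3032c5c4") // already gone: NotFound is ignored
    }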
Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.048901 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.403853 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-c5xcw"] Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.556361 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d365-account-create-update-hxfxb"] Feb 27 01:23:08 crc kubenswrapper[4771]: W0227 01:23:08.561166 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f679e1_32b8_4041_bee8_4686a9a9ae2e.slice/crio-cdeeb98a1811e6e3f9c1612209ececd07f15c7fb2f64129f285f66359c5bad6b WatchSource:0}: Error finding container cdeeb98a1811e6e3f9c1612209ececd07f15c7fb2f64129f285f66359c5bad6b: Status 404 returned error can't find the container with id cdeeb98a1811e6e3f9c1612209ececd07f15c7fb2f64129f285f66359c5bad6b Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.869358 4771 generic.go:334] "Generic (PLEG): container finished" podID="ae89ba17-392c-48f6-b05f-5217350743fe" containerID="f4b40801898389d09b69135513f45175da3b95797febad5e12f39119006aed40" exitCode=0 Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.869416 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c5xcw" event={"ID":"ae89ba17-392c-48f6-b05f-5217350743fe","Type":"ContainerDied","Data":"f4b40801898389d09b69135513f45175da3b95797febad5e12f39119006aed40"} Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.869440 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c5xcw" event={"ID":"ae89ba17-392c-48f6-b05f-5217350743fe","Type":"ContainerStarted","Data":"58f936e4179ba2085c7b007d89144dbe7ea01833016b16d1b0a2c2aa4575ed6a"} Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.877027 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d365-account-create-update-hxfxb" event={"ID":"b8f679e1-32b8-4041-bee8-4686a9a9ae2e","Type":"ContainerStarted","Data":"2e182beeca8a6f3ee67571563ae6acb6f8d973520fd0524931f48e9b19489ad9"} Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.877071 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d365-account-create-update-hxfxb" event={"ID":"b8f679e1-32b8-4041-bee8-4686a9a9ae2e","Type":"ContainerStarted","Data":"cdeeb98a1811e6e3f9c1612209ececd07f15c7fb2f64129f285f66359c5bad6b"} Feb 27 01:23:08 crc kubenswrapper[4771]: I0227 01:23:08.899459 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-d365-account-create-update-hxfxb" podStartSLOduration=1.89944197 podStartE2EDuration="1.89944197s" podCreationTimestamp="2026-02-27 01:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:08.897843607 +0000 UTC m=+1101.835404895" watchObservedRunningTime="2026-02-27 01:23:08.89944197 +0000 UTC m=+1101.837003258" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.530613 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lvns4"] Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.532014 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.543883 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.558617 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lvns4"] Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.603309 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e76075d-e9ff-4d37-be85-85a02edaebd8-operator-scripts\") pod \"root-account-create-update-lvns4\" (UID: \"5e76075d-e9ff-4d37-be85-85a02edaebd8\") " pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.603388 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jgm2\" (UniqueName: \"kubernetes.io/projected/5e76075d-e9ff-4d37-be85-85a02edaebd8-kube-api-access-6jgm2\") pod \"root-account-create-update-lvns4\" (UID: \"5e76075d-e9ff-4d37-be85-85a02edaebd8\") " pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.705521 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e76075d-e9ff-4d37-be85-85a02edaebd8-operator-scripts\") pod \"root-account-create-update-lvns4\" (UID: \"5e76075d-e9ff-4d37-be85-85a02edaebd8\") " pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.705638 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.705672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jgm2\" (UniqueName: \"kubernetes.io/projected/5e76075d-e9ff-4d37-be85-85a02edaebd8-kube-api-access-6jgm2\") pod \"root-account-create-update-lvns4\" (UID: \"5e76075d-e9ff-4d37-be85-85a02edaebd8\") " pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:09 crc kubenswrapper[4771]: E0227 01:23:09.706033 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 01:23:09 crc kubenswrapper[4771]: E0227 01:23:09.706063 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 01:23:09 crc kubenswrapper[4771]: E0227 01:23:09.706130 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift podName:251e5c6f-c762-4a6e-9253-81f94d592239 nodeName:}" failed. No retries permitted until 2026-02-27 01:23:13.706107003 +0000 UTC m=+1106.643668291 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift") pod "swift-storage-0" (UID: "251e5c6f-c762-4a6e-9253-81f94d592239") : configmap "swift-ring-files" not found Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.707021 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e76075d-e9ff-4d37-be85-85a02edaebd8-operator-scripts\") pod \"root-account-create-update-lvns4\" (UID: \"5e76075d-e9ff-4d37-be85-85a02edaebd8\") " pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.726603 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jgm2\" (UniqueName: \"kubernetes.io/projected/5e76075d-e9ff-4d37-be85-85a02edaebd8-kube-api-access-6jgm2\") pod \"root-account-create-update-lvns4\" (UID: \"5e76075d-e9ff-4d37-be85-85a02edaebd8\") " pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.788023 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbec25a6-8536-4f09-af33-bd1a36b9e051" path="/var/lib/kubelet/pods/bbec25a6-8536-4f09-af33-bd1a36b9e051/volumes" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.862902 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.901915 4771 generic.go:334] "Generic (PLEG): container finished" podID="b8f679e1-32b8-4041-bee8-4686a9a9ae2e" containerID="2e182beeca8a6f3ee67571563ae6acb6f8d973520fd0524931f48e9b19489ad9" exitCode=0 Feb 27 01:23:09 crc kubenswrapper[4771]: I0227 01:23:09.901956 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d365-account-create-update-hxfxb" event={"ID":"b8f679e1-32b8-4041-bee8-4686a9a9ae2e","Type":"ContainerDied","Data":"2e182beeca8a6f3ee67571563ae6acb6f8d973520fd0524931f48e9b19489ad9"} Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.220222 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.227538 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.376231 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b9r5\" (UniqueName: \"kubernetes.io/projected/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-kube-api-access-7b9r5\") pod \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\" (UID: \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\") " Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.377511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-operator-scripts\") pod \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\" (UID: \"b8f679e1-32b8-4041-bee8-4686a9a9ae2e\") " Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.377606 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gjsr\" (UniqueName: \"kubernetes.io/projected/ae89ba17-392c-48f6-b05f-5217350743fe-kube-api-access-5gjsr\") pod \"ae89ba17-392c-48f6-b05f-5217350743fe\" (UID: \"ae89ba17-392c-48f6-b05f-5217350743fe\") " Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.377670 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae89ba17-392c-48f6-b05f-5217350743fe-operator-scripts\") pod \"ae89ba17-392c-48f6-b05f-5217350743fe\" (UID: \"ae89ba17-392c-48f6-b05f-5217350743fe\") " Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.378169 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae89ba17-392c-48f6-b05f-5217350743fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae89ba17-392c-48f6-b05f-5217350743fe" (UID: "ae89ba17-392c-48f6-b05f-5217350743fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.378280 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8f679e1-32b8-4041-bee8-4686a9a9ae2e" (UID: "b8f679e1-32b8-4041-bee8-4686a9a9ae2e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.378595 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.378630 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae89ba17-392c-48f6-b05f-5217350743fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.381676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae89ba17-392c-48f6-b05f-5217350743fe-kube-api-access-5gjsr" (OuterVolumeSpecName: "kube-api-access-5gjsr") pod "ae89ba17-392c-48f6-b05f-5217350743fe" (UID: "ae89ba17-392c-48f6-b05f-5217350743fe"). InnerVolumeSpecName "kube-api-access-5gjsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.382210 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-kube-api-access-7b9r5" (OuterVolumeSpecName: "kube-api-access-7b9r5") pod "b8f679e1-32b8-4041-bee8-4686a9a9ae2e" (UID: "b8f679e1-32b8-4041-bee8-4686a9a9ae2e"). InnerVolumeSpecName "kube-api-access-7b9r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.481039 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gjsr\" (UniqueName: \"kubernetes.io/projected/ae89ba17-392c-48f6-b05f-5217350743fe-kube-api-access-5gjsr\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.481095 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b9r5\" (UniqueName: \"kubernetes.io/projected/b8f679e1-32b8-4041-bee8-4686a9a9ae2e-kube-api-access-7b9r5\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.503762 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lvns4"] Feb 27 01:23:11 crc kubenswrapper[4771]: W0227 01:23:11.505415 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e76075d_e9ff_4d37_be85_85a02edaebd8.slice/crio-62e6d6900891a2779ab05f93448c0e3842e2c17b72e2facb8fdc049d9443c8b9 WatchSource:0}: Error finding container 62e6d6900891a2779ab05f93448c0e3842e2c17b72e2facb8fdc049d9443c8b9: Status 404 returned error can't find the container with id 62e6d6900891a2779ab05f93448c0e3842e2c17b72e2facb8fdc049d9443c8b9 Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.924303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c5xcw" event={"ID":"ae89ba17-392c-48f6-b05f-5217350743fe","Type":"ContainerDied","Data":"58f936e4179ba2085c7b007d89144dbe7ea01833016b16d1b0a2c2aa4575ed6a"} Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.924673 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58f936e4179ba2085c7b007d89144dbe7ea01833016b16d1b0a2c2aa4575ed6a" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.924758 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c5xcw" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.931043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d365-account-create-update-hxfxb" event={"ID":"b8f679e1-32b8-4041-bee8-4686a9a9ae2e","Type":"ContainerDied","Data":"cdeeb98a1811e6e3f9c1612209ececd07f15c7fb2f64129f285f66359c5bad6b"} Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.931086 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdeeb98a1811e6e3f9c1612209ececd07f15c7fb2f64129f285f66359c5bad6b" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.931146 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d365-account-create-update-hxfxb" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.934629 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lvns4" event={"ID":"5e76075d-e9ff-4d37-be85-85a02edaebd8","Type":"ContainerStarted","Data":"5b673c146cecb1f36e1160b06cd8e4b5697929ed2b501711044d9f881a5cc945"} Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.934707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lvns4" event={"ID":"5e76075d-e9ff-4d37-be85-85a02edaebd8","Type":"ContainerStarted","Data":"62e6d6900891a2779ab05f93448c0e3842e2c17b72e2facb8fdc049d9443c8b9"} Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.938904 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cm796" event={"ID":"8a59a151-f189-4128-b462-29557b12a8da","Type":"ContainerStarted","Data":"066c6a80c412fa45f3bcb574bd851d6ec0c819069a2947b3e39bf79643344c66"} Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.968942 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-lvns4" podStartSLOduration=2.968922721 podStartE2EDuration="2.968922721s" podCreationTimestamp="2026-02-27 01:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:11.96003187 +0000 UTC m=+1104.897593168" watchObservedRunningTime="2026-02-27 01:23:11.968922721 +0000 UTC m=+1104.906484019" Feb 27 01:23:11 crc kubenswrapper[4771]: I0227 01:23:11.994313 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-cm796" podStartSLOduration=2.073682474 podStartE2EDuration="5.994258942s" podCreationTimestamp="2026-02-27 01:23:06 +0000 UTC" firstStartedPulling="2026-02-27 01:23:07.154102821 +0000 UTC m=+1100.091664109" lastFinishedPulling="2026-02-27 01:23:11.074679289 +0000 UTC m=+1104.012240577" observedRunningTime="2026-02-27 01:23:11.987325493 +0000 UTC m=+1104.924886801" watchObservedRunningTime="2026-02-27 01:23:11.994258942 +0000 UTC m=+1104.931820230"
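The two "Observed pod startup duration" entries above report both podStartE2EDuration and podStartSLOduration; the SLO figure excludes the time spent pulling images. For swift-ring-rebalance-cm796, 5.994258942s end to end minus the pull window from 01:23:07.154102821 to 01:23:11.074679289 (3.920576468s) gives exactly the logged podStartSLOduration of 2.073682474s; for root-account-create-update-lvns4 the pull timestamps are the zero value, so the two figures coincide. A short Go check of that arithmetic (the layout string and variable names are mine, not kubelet code):

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    	firstPull, _ := time.Parse(layout, "2026-02-27 01:23:07.154102821 +0000 UTC")
    	lastPull, _ := time.Parse(layout, "2026-02-27 01:23:11.074679289 +0000 UTC")
    	e2e := 5994258942 * time.Nanosecond // podStartE2EDuration="5.994258942s"
    	slo := e2e - lastPull.Sub(firstPull) // subtract the image-pull window
    	fmt.Println(slo)                     // 2.073682474s, matching podStartSLOduration
    }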
Feb 27 01:23:12 crc kubenswrapper[4771]: E0227 01:23:12.032102 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae89ba17_392c_48f6_b05f_5217350743fe.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae89ba17_392c_48f6_b05f_5217350743fe.slice/crio-58f936e4179ba2085c7b007d89144dbe7ea01833016b16d1b0a2c2aa4575ed6a\": RecentStats: unable to find data in memory cache]" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.891678 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xxhc8"] Feb 27 01:23:12 crc kubenswrapper[4771]: E0227 01:23:12.893579 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f679e1-32b8-4041-bee8-4686a9a9ae2e" containerName="mariadb-account-create-update" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.893679 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f679e1-32b8-4041-bee8-4686a9a9ae2e" containerName="mariadb-account-create-update" Feb 27 01:23:12 crc kubenswrapper[4771]: E0227 01:23:12.893770 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae89ba17-392c-48f6-b05f-5217350743fe" containerName="mariadb-database-create" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.893841 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae89ba17-392c-48f6-b05f-5217350743fe" containerName="mariadb-database-create" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.894152 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae89ba17-392c-48f6-b05f-5217350743fe" containerName="mariadb-database-create" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.894252 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f679e1-32b8-4041-bee8-4686a9a9ae2e" containerName="mariadb-account-create-update" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.895919 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.899238 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xxhc8"] Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.903369 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jpjnq" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.903369 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.948303 4771 generic.go:334] "Generic (PLEG): container finished" podID="5e76075d-e9ff-4d37-be85-85a02edaebd8" containerID="5b673c146cecb1f36e1160b06cd8e4b5697929ed2b501711044d9f881a5cc945" exitCode=0 Feb 27 01:23:12 crc kubenswrapper[4771]: I0227 01:23:12.948389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lvns4" event={"ID":"5e76075d-e9ff-4d37-be85-85a02edaebd8","Type":"ContainerDied","Data":"5b673c146cecb1f36e1160b06cd8e4b5697929ed2b501711044d9f881a5cc945"} Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.015802 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-db-sync-config-data\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.015945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-config-data\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.016048 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9wfk\" (UniqueName: \"kubernetes.io/projected/1e019cf0-706d-444b-98cb-b07123d2a0d1-kube-api-access-w9wfk\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.016103 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-combined-ca-bundle\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 
01:23:13.117782 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-db-sync-config-data\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.117946 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-config-data\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.118033 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9wfk\" (UniqueName: \"kubernetes.io/projected/1e019cf0-706d-444b-98cb-b07123d2a0d1-kube-api-access-w9wfk\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.118096 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-combined-ca-bundle\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.130131 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-db-sync-config-data\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.130981 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-config-data\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.131070 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-combined-ca-bundle\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.141773 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9wfk\" (UniqueName: \"kubernetes.io/projected/1e019cf0-706d-444b-98cb-b07123d2a0d1-kube-api-access-w9wfk\") pod \"glance-db-sync-xxhc8\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") " pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.227978 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.528981 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rnr4x"] Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.530201 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.545660 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rnr4x"] Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.622828 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-768f-account-create-update-gxklj"] Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.623912 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.624698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-operator-scripts\") pod \"keystone-db-create-rnr4x\" (UID: \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\") " pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.624863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rsf\" (UniqueName: \"kubernetes.io/projected/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-kube-api-access-45rsf\") pod \"keystone-db-create-rnr4x\" (UID: \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\") " pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.625445 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.634735 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-768f-account-create-update-gxklj"] Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.725893 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60076d22-0bfa-4f4e-adde-42e991825877-operator-scripts\") pod \"keystone-768f-account-create-update-gxklj\" (UID: \"60076d22-0bfa-4f4e-adde-42e991825877\") " pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.726000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpjnd\" (UniqueName: \"kubernetes.io/projected/60076d22-0bfa-4f4e-adde-42e991825877-kube-api-access-lpjnd\") pod \"keystone-768f-account-create-update-gxklj\" (UID: \"60076d22-0bfa-4f4e-adde-42e991825877\") " pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.726043 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rsf\" (UniqueName: \"kubernetes.io/projected/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-kube-api-access-45rsf\") pod \"keystone-db-create-rnr4x\" (UID: \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\") " pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.726223 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.726292 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-operator-scripts\") pod \"keystone-db-create-rnr4x\" (UID: \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\") " pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:13 crc kubenswrapper[4771]: E0227 01:23:13.726425 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 27 01:23:13 crc kubenswrapper[4771]: E0227 01:23:13.726451 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 27 01:23:13 crc kubenswrapper[4771]: E0227 01:23:13.726517 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift podName:251e5c6f-c762-4a6e-9253-81f94d592239 nodeName:}" failed. No retries permitted until 2026-02-27 01:23:21.726487796 +0000 UTC m=+1114.664049084 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift") pod "swift-storage-0" (UID: "251e5c6f-c762-4a6e-9253-81f94d592239") : configmap "swift-ring-files" not found Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.727227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-operator-scripts\") pod \"keystone-db-create-rnr4x\" (UID: \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\") " pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.745244 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rsf\" (UniqueName: \"kubernetes.io/projected/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-kube-api-access-45rsf\") pod \"keystone-db-create-rnr4x\" (UID: \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\") " pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.751395 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xxhc8"] Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.828413 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60076d22-0bfa-4f4e-adde-42e991825877-operator-scripts\") pod \"keystone-768f-account-create-update-gxklj\" (UID: \"60076d22-0bfa-4f4e-adde-42e991825877\") " pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.828797 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpjnd\" (UniqueName: \"kubernetes.io/projected/60076d22-0bfa-4f4e-adde-42e991825877-kube-api-access-lpjnd\") pod \"keystone-768f-account-create-update-gxklj\" (UID: \"60076d22-0bfa-4f4e-adde-42e991825877\") " pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.831048 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60076d22-0bfa-4f4e-adde-42e991825877-operator-scripts\") pod \"keystone-768f-account-create-update-gxklj\" (UID: \"60076d22-0bfa-4f4e-adde-42e991825877\") " pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.844427 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lpjnd\" (UniqueName: \"kubernetes.io/projected/60076d22-0bfa-4f4e-adde-42e991825877-kube-api-access-lpjnd\") pod \"keystone-768f-account-create-update-gxklj\" (UID: \"60076d22-0bfa-4f4e-adde-42e991825877\") " pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.850739 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.943013 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:13 crc kubenswrapper[4771]: I0227 01:23:13.964923 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xxhc8" event={"ID":"1e019cf0-706d-444b-98cb-b07123d2a0d1","Type":"ContainerStarted","Data":"eeae6e297cce627c7bea8b24179c57a1ea7da3a21f8338deb8d4eaa1e5ac88fd"} Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.337610 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rnr4x"] Feb 27 01:23:14 crc kubenswrapper[4771]: W0227 01:23:14.339307 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecdc6e63_c0a3_4fec_9fb5_19b41e507b4f.slice/crio-c617d7065c9ae2909279b714d092f3a2c13e91302132e4410774452973657059 WatchSource:0}: Error finding container c617d7065c9ae2909279b714d092f3a2c13e91302132e4410774452973657059: Status 404 returned error can't find the container with id c617d7065c9ae2909279b714d092f3a2c13e91302132e4410774452973657059 Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.441855 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.465728 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-768f-account-create-update-gxklj"] Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.541773 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e76075d-e9ff-4d37-be85-85a02edaebd8-operator-scripts\") pod \"5e76075d-e9ff-4d37-be85-85a02edaebd8\" (UID: \"5e76075d-e9ff-4d37-be85-85a02edaebd8\") " Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.541872 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jgm2\" (UniqueName: \"kubernetes.io/projected/5e76075d-e9ff-4d37-be85-85a02edaebd8-kube-api-access-6jgm2\") pod \"5e76075d-e9ff-4d37-be85-85a02edaebd8\" (UID: \"5e76075d-e9ff-4d37-be85-85a02edaebd8\") " Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.542780 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e76075d-e9ff-4d37-be85-85a02edaebd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e76075d-e9ff-4d37-be85-85a02edaebd8" (UID: "5e76075d-e9ff-4d37-be85-85a02edaebd8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.546829 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e76075d-e9ff-4d37-be85-85a02edaebd8-kube-api-access-6jgm2" (OuterVolumeSpecName: "kube-api-access-6jgm2") pod "5e76075d-e9ff-4d37-be85-85a02edaebd8" (UID: "5e76075d-e9ff-4d37-be85-85a02edaebd8"). InnerVolumeSpecName "kube-api-access-6jgm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.643303 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jgm2\" (UniqueName: \"kubernetes.io/projected/5e76075d-e9ff-4d37-be85-85a02edaebd8-kube-api-access-6jgm2\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.643707 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e76075d-e9ff-4d37-be85-85a02edaebd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.976308 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lvns4" Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.976300 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lvns4" event={"ID":"5e76075d-e9ff-4d37-be85-85a02edaebd8","Type":"ContainerDied","Data":"62e6d6900891a2779ab05f93448c0e3842e2c17b72e2facb8fdc049d9443c8b9"} Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.976355 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62e6d6900891a2779ab05f93448c0e3842e2c17b72e2facb8fdc049d9443c8b9" Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.979065 4771 generic.go:334] "Generic (PLEG): container finished" podID="ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f" containerID="727b38ebe58c1975f28912b967f22314cfe0bfe0e1164b4f82597f2eb40c1934" exitCode=0 Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.979130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rnr4x" event={"ID":"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f","Type":"ContainerDied","Data":"727b38ebe58c1975f28912b967f22314cfe0bfe0e1164b4f82597f2eb40c1934"} Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.979160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rnr4x" event={"ID":"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f","Type":"ContainerStarted","Data":"c617d7065c9ae2909279b714d092f3a2c13e91302132e4410774452973657059"} Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.981276 4771 generic.go:334] "Generic (PLEG): container finished" podID="60076d22-0bfa-4f4e-adde-42e991825877" containerID="41d35fa3bc72ae3e6a241953dad69a0fe304a3eb8fee468bc3293f76983c9c95" exitCode=0 Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.981308 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-768f-account-create-update-gxklj" event={"ID":"60076d22-0bfa-4f4e-adde-42e991825877","Type":"ContainerDied","Data":"41d35fa3bc72ae3e6a241953dad69a0fe304a3eb8fee468bc3293f76983c9c95"} Feb 27 01:23:14 crc kubenswrapper[4771]: I0227 01:23:14.981336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-768f-account-create-update-gxklj" 
event={"ID":"60076d22-0bfa-4f4e-adde-42e991825877","Type":"ContainerStarted","Data":"614af36926dff2a37e11bb1011cd8fa840b1f9e14b52106da8fc6e19c4067ff9"} Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.100694 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.162406 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cb5pm"] Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.162656 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" podUID="b40b4842-d003-44ce-aa40-f298d8deced5" containerName="dnsmasq-dns" containerID="cri-o://1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb" gracePeriod=10 Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.756842 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.909130 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-dns-svc\") pod \"b40b4842-d003-44ce-aa40-f298d8deced5\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.909307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rqzf\" (UniqueName: \"kubernetes.io/projected/b40b4842-d003-44ce-aa40-f298d8deced5-kube-api-access-9rqzf\") pod \"b40b4842-d003-44ce-aa40-f298d8deced5\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.909363 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-config\") pod \"b40b4842-d003-44ce-aa40-f298d8deced5\" (UID: \"b40b4842-d003-44ce-aa40-f298d8deced5\") " Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.933095 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40b4842-d003-44ce-aa40-f298d8deced5-kube-api-access-9rqzf" (OuterVolumeSpecName: "kube-api-access-9rqzf") pod "b40b4842-d003-44ce-aa40-f298d8deced5" (UID: "b40b4842-d003-44ce-aa40-f298d8deced5"). InnerVolumeSpecName "kube-api-access-9rqzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.977497 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b40b4842-d003-44ce-aa40-f298d8deced5" (UID: "b40b4842-d003-44ce-aa40-f298d8deced5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.985469 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lvns4"] Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.991884 4771 generic.go:334] "Generic (PLEG): container finished" podID="b40b4842-d003-44ce-aa40-f298d8deced5" containerID="1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb" exitCode=0 Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.991936 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" event={"ID":"b40b4842-d003-44ce-aa40-f298d8deced5","Type":"ContainerDied","Data":"1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb"} Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.991988 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" event={"ID":"b40b4842-d003-44ce-aa40-f298d8deced5","Type":"ContainerDied","Data":"a0bdbbc6c4fdd8389c8f8cb66277ce7d153d256f6c8fe838e45cd633974cc85f"} Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.991992 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cb5pm" Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.992012 4771 scope.go:117] "RemoveContainer" containerID="1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb" Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.997478 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lvns4"] Feb 27 01:23:15 crc kubenswrapper[4771]: I0227 01:23:15.997693 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-config" (OuterVolumeSpecName: "config") pod "b40b4842-d003-44ce-aa40-f298d8deced5" (UID: "b40b4842-d003-44ce-aa40-f298d8deced5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.009861 4771 scope.go:117] "RemoveContainer" containerID="22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.011515 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.011539 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rqzf\" (UniqueName: \"kubernetes.io/projected/b40b4842-d003-44ce-aa40-f298d8deced5-kube-api-access-9rqzf\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.011562 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40b4842-d003-44ce-aa40-f298d8deced5-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.034035 4771 scope.go:117] "RemoveContainer" containerID="1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb" Feb 27 01:23:16 crc kubenswrapper[4771]: E0227 01:23:16.034769 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb\": container with ID starting with 1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb not found: ID does not exist" containerID="1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.034812 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb"} err="failed to get container status \"1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb\": rpc error: code = NotFound desc = could not find container \"1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb\": container with ID starting with 1c79557a1b8a1c60c006d2d2f5be6a0cbcabb7ea52a7dc22af2e3572c2d8a7fb not found: ID does not exist" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.034836 4771 scope.go:117] "RemoveContainer" containerID="22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348" Feb 27 01:23:16 crc kubenswrapper[4771]: E0227 01:23:16.035168 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348\": container with ID starting with 22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348 not found: ID does not exist" containerID="22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.035183 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348"} err="failed to get container status \"22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348\": rpc error: code = NotFound desc = could not find container \"22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348\": container with ID starting with 22f2e8e34c04cdd22a4a765a63375b7cd357ed39c94c7bf40df8d9fd1a35c348 not found: ID does not exist" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 
01:23:16.374975 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cb5pm"] Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.396618 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cb5pm"] Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.500748 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.514136 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.620444 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpjnd\" (UniqueName: \"kubernetes.io/projected/60076d22-0bfa-4f4e-adde-42e991825877-kube-api-access-lpjnd\") pod \"60076d22-0bfa-4f4e-adde-42e991825877\" (UID: \"60076d22-0bfa-4f4e-adde-42e991825877\") " Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.620598 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-operator-scripts\") pod \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\" (UID: \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\") " Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.620750 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rsf\" (UniqueName: \"kubernetes.io/projected/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-kube-api-access-45rsf\") pod \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\" (UID: \"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f\") " Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.620831 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60076d22-0bfa-4f4e-adde-42e991825877-operator-scripts\") pod \"60076d22-0bfa-4f4e-adde-42e991825877\" (UID: \"60076d22-0bfa-4f4e-adde-42e991825877\") " Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.621374 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f" (UID: "ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.621572 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.622018 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60076d22-0bfa-4f4e-adde-42e991825877-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60076d22-0bfa-4f4e-adde-42e991825877" (UID: "60076d22-0bfa-4f4e-adde-42e991825877"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.624750 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-kube-api-access-45rsf" (OuterVolumeSpecName: "kube-api-access-45rsf") pod "ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f" (UID: "ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f"). InnerVolumeSpecName "kube-api-access-45rsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.625688 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60076d22-0bfa-4f4e-adde-42e991825877-kube-api-access-lpjnd" (OuterVolumeSpecName: "kube-api-access-lpjnd") pod "60076d22-0bfa-4f4e-adde-42e991825877" (UID: "60076d22-0bfa-4f4e-adde-42e991825877"). InnerVolumeSpecName "kube-api-access-lpjnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.722981 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpjnd\" (UniqueName: \"kubernetes.io/projected/60076d22-0bfa-4f4e-adde-42e991825877-kube-api-access-lpjnd\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.723025 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rsf\" (UniqueName: \"kubernetes.io/projected/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f-kube-api-access-45rsf\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:16 crc kubenswrapper[4771]: I0227 01:23:16.723036 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60076d22-0bfa-4f4e-adde-42e991825877-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:17 crc kubenswrapper[4771]: I0227 01:23:17.001577 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-768f-account-create-update-gxklj" event={"ID":"60076d22-0bfa-4f4e-adde-42e991825877","Type":"ContainerDied","Data":"614af36926dff2a37e11bb1011cd8fa840b1f9e14b52106da8fc6e19c4067ff9"} Feb 27 01:23:17 crc kubenswrapper[4771]: I0227 01:23:17.001616 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="614af36926dff2a37e11bb1011cd8fa840b1f9e14b52106da8fc6e19c4067ff9" Feb 27 01:23:17 crc kubenswrapper[4771]: I0227 01:23:17.001670 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-768f-account-create-update-gxklj" Feb 27 01:23:17 crc kubenswrapper[4771]: I0227 01:23:17.005084 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rnr4x" event={"ID":"ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f","Type":"ContainerDied","Data":"c617d7065c9ae2909279b714d092f3a2c13e91302132e4410774452973657059"} Feb 27 01:23:17 crc kubenswrapper[4771]: I0227 01:23:17.005136 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c617d7065c9ae2909279b714d092f3a2c13e91302132e4410774452973657059" Feb 27 01:23:17 crc kubenswrapper[4771]: I0227 01:23:17.005187 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rnr4x" Feb 27 01:23:17 crc kubenswrapper[4771]: I0227 01:23:17.791954 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e76075d-e9ff-4d37-be85-85a02edaebd8" path="/var/lib/kubelet/pods/5e76075d-e9ff-4d37-be85-85a02edaebd8/volumes" Feb 27 01:23:17 crc kubenswrapper[4771]: I0227 01:23:17.792674 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40b4842-d003-44ce-aa40-f298d8deced5" path="/var/lib/kubelet/pods/b40b4842-d003-44ce-aa40-f298d8deced5/volumes" Feb 27 01:23:18 crc kubenswrapper[4771]: I0227 01:23:18.014620 4771 generic.go:334] "Generic (PLEG): container finished" podID="8a59a151-f189-4128-b462-29557b12a8da" containerID="066c6a80c412fa45f3bcb574bd851d6ec0c819069a2947b3e39bf79643344c66" exitCode=0 Feb 27 01:23:18 crc kubenswrapper[4771]: I0227 01:23:18.014669 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cm796" event={"ID":"8a59a151-f189-4128-b462-29557b12a8da","Type":"ContainerDied","Data":"066c6a80c412fa45f3bcb574bd851d6ec0c819069a2947b3e39bf79643344c66"} Feb 27 01:23:18 crc kubenswrapper[4771]: I0227 01:23:18.348802 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.500159 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.592073 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-ring-data-devices\") pod \"8a59a151-f189-4128-b462-29557b12a8da\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.592422 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-swiftconf\") pod \"8a59a151-f189-4128-b462-29557b12a8da\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.592614 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-dispersionconf\") pod \"8a59a151-f189-4128-b462-29557b12a8da\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.592770 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8a59a151-f189-4128-b462-29557b12a8da" (UID: "8a59a151-f189-4128-b462-29557b12a8da"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.592909 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-scripts\") pod \"8a59a151-f189-4128-b462-29557b12a8da\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.593147 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a59a151-f189-4128-b462-29557b12a8da-etc-swift\") pod \"8a59a151-f189-4128-b462-29557b12a8da\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.593275 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj74q\" (UniqueName: \"kubernetes.io/projected/8a59a151-f189-4128-b462-29557b12a8da-kube-api-access-kj74q\") pod \"8a59a151-f189-4128-b462-29557b12a8da\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.593378 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-combined-ca-bundle\") pod \"8a59a151-f189-4128-b462-29557b12a8da\" (UID: \"8a59a151-f189-4128-b462-29557b12a8da\") " Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.593861 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a59a151-f189-4128-b462-29557b12a8da-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8a59a151-f189-4128-b462-29557b12a8da" (UID: "8a59a151-f189-4128-b462-29557b12a8da"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.594091 4771 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a59a151-f189-4128-b462-29557b12a8da-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.594208 4771 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.601863 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a59a151-f189-4128-b462-29557b12a8da-kube-api-access-kj74q" (OuterVolumeSpecName: "kube-api-access-kj74q") pod "8a59a151-f189-4128-b462-29557b12a8da" (UID: "8a59a151-f189-4128-b462-29557b12a8da"). InnerVolumeSpecName "kube-api-access-kj74q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.602351 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8a59a151-f189-4128-b462-29557b12a8da" (UID: "8a59a151-f189-4128-b462-29557b12a8da"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.624657 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8a59a151-f189-4128-b462-29557b12a8da" (UID: "8a59a151-f189-4128-b462-29557b12a8da"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.646290 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fjbn5"] Feb 27 01:23:19 crc kubenswrapper[4771]: E0227 01:23:19.646744 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60076d22-0bfa-4f4e-adde-42e991825877" containerName="mariadb-account-create-update" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.646787 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="60076d22-0bfa-4f4e-adde-42e991825877" containerName="mariadb-account-create-update" Feb 27 01:23:19 crc kubenswrapper[4771]: E0227 01:23:19.646806 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40b4842-d003-44ce-aa40-f298d8deced5" containerName="init" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.646812 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40b4842-d003-44ce-aa40-f298d8deced5" containerName="init" Feb 27 01:23:19 crc kubenswrapper[4771]: E0227 01:23:19.646822 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f" containerName="mariadb-database-create" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.646828 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f" containerName="mariadb-database-create" Feb 27 01:23:19 crc kubenswrapper[4771]: E0227 01:23:19.646901 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40b4842-d003-44ce-aa40-f298d8deced5" containerName="dnsmasq-dns" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.646908 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40b4842-d003-44ce-aa40-f298d8deced5" containerName="dnsmasq-dns" Feb 27 01:23:19 crc kubenswrapper[4771]: E0227 01:23:19.646920 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a59a151-f189-4128-b462-29557b12a8da" containerName="swift-ring-rebalance" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.646946 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a59a151-f189-4128-b462-29557b12a8da" containerName="swift-ring-rebalance" Feb 27 01:23:19 crc kubenswrapper[4771]: E0227 01:23:19.646963 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e76075d-e9ff-4d37-be85-85a02edaebd8" containerName="mariadb-account-create-update" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.646971 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e76075d-e9ff-4d37-be85-85a02edaebd8" containerName="mariadb-account-create-update" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.647158 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f" containerName="mariadb-database-create" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.647190 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a59a151-f189-4128-b462-29557b12a8da" containerName="swift-ring-rebalance" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.647201 
4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="60076d22-0bfa-4f4e-adde-42e991825877" containerName="mariadb-account-create-update" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.647214 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40b4842-d003-44ce-aa40-f298d8deced5" containerName="dnsmasq-dns" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.647226 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e76075d-e9ff-4d37-be85-85a02edaebd8" containerName="mariadb-account-create-update" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.647370 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-scripts" (OuterVolumeSpecName: "scripts") pod "8a59a151-f189-4128-b462-29557b12a8da" (UID: "8a59a151-f189-4128-b462-29557b12a8da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.647859 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fjbn5" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.650267 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.650251 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a59a151-f189-4128-b462-29557b12a8da" (UID: "8a59a151-f189-4128-b462-29557b12a8da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.654074 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fjbn5"] Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.696492 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a026cce-a7cd-45e6-b4e0-6081d19b016a-operator-scripts\") pod \"root-account-create-update-fjbn5\" (UID: \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\") " pod="openstack/root-account-create-update-fjbn5" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.696676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg75x\" (UniqueName: \"kubernetes.io/projected/9a026cce-a7cd-45e6-b4e0-6081d19b016a-kube-api-access-jg75x\") pod \"root-account-create-update-fjbn5\" (UID: \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\") " pod="openstack/root-account-create-update-fjbn5" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.696967 4771 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.696999 4771 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.697021 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8a59a151-f189-4128-b462-29557b12a8da-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.697038 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj74q\" (UniqueName: \"kubernetes.io/projected/8a59a151-f189-4128-b462-29557b12a8da-kube-api-access-kj74q\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.697054 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59a151-f189-4128-b462-29557b12a8da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.798132 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg75x\" (UniqueName: \"kubernetes.io/projected/9a026cce-a7cd-45e6-b4e0-6081d19b016a-kube-api-access-jg75x\") pod \"root-account-create-update-fjbn5\" (UID: \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\") " pod="openstack/root-account-create-update-fjbn5" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.798421 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a026cce-a7cd-45e6-b4e0-6081d19b016a-operator-scripts\") pod \"root-account-create-update-fjbn5\" (UID: \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\") " pod="openstack/root-account-create-update-fjbn5" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.799675 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a026cce-a7cd-45e6-b4e0-6081d19b016a-operator-scripts\") pod \"root-account-create-update-fjbn5\" (UID: \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\") " pod="openstack/root-account-create-update-fjbn5" Feb 27 01:23:19 crc kubenswrapper[4771]: I0227 01:23:19.814335 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg75x\" (UniqueName: \"kubernetes.io/projected/9a026cce-a7cd-45e6-b4e0-6081d19b016a-kube-api-access-jg75x\") pod \"root-account-create-update-fjbn5\" (UID: \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\") " pod="openstack/root-account-create-update-fjbn5" Feb 27 01:23:20 crc kubenswrapper[4771]: I0227 01:23:20.031375 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fjbn5" Feb 27 01:23:20 crc kubenswrapper[4771]: I0227 01:23:20.037783 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cm796" event={"ID":"8a59a151-f189-4128-b462-29557b12a8da","Type":"ContainerDied","Data":"cd9cf29f0bd64d79d0f1b30e92787f76dbf458be22b2631f900807cee18e818f"} Feb 27 01:23:20 crc kubenswrapper[4771]: I0227 01:23:20.037833 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd9cf29f0bd64d79d0f1b30e92787f76dbf458be22b2631f900807cee18e818f" Feb 27 01:23:20 crc kubenswrapper[4771]: I0227 01:23:20.037915 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cm796" Feb 27 01:23:21 crc kubenswrapper[4771]: I0227 01:23:21.728442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:21 crc kubenswrapper[4771]: I0227 01:23:21.741856 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/251e5c6f-c762-4a6e-9253-81f94d592239-etc-swift\") pod \"swift-storage-0\" (UID: \"251e5c6f-c762-4a6e-9253-81f94d592239\") " pod="openstack/swift-storage-0" Feb 27 01:23:21 crc kubenswrapper[4771]: I0227 01:23:21.840935 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 27 01:23:23 crc kubenswrapper[4771]: I0227 01:23:23.040160 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s5lkp" podUID="8c578c69-744e-425b-8bb1-76eec4b332ec" containerName="ovn-controller" probeResult="failure" output=< Feb 27 01:23:23 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 27 01:23:23 crc kubenswrapper[4771]: > Feb 27 01:23:23 crc kubenswrapper[4771]: I0227 01:23:23.059593 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tjchc" Feb 27 01:23:25 crc kubenswrapper[4771]: I0227 01:23:25.074643 4771 generic.go:334] "Generic (PLEG): container finished" podID="a3aec8d2-008a-4b77-a30b-23f8e812e332" containerID="29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70" exitCode=0 Feb 27 01:23:25 crc kubenswrapper[4771]: I0227 01:23:25.074719 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a3aec8d2-008a-4b77-a30b-23f8e812e332","Type":"ContainerDied","Data":"29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70"} Feb 27 01:23:25 crc kubenswrapper[4771]: I0227 01:23:25.077980 4771 generic.go:334] "Generic (PLEG): container finished" podID="a2c84581-5806-46dd-b352-390ef2d9826c" containerID="21065341d65c55328d33fca19982cb91c451939d0b0dd32c90272cca9aecf888" exitCode=0 Feb 27 01:23:25 crc kubenswrapper[4771]: I0227 01:23:25.078011 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a2c84581-5806-46dd-b352-390ef2d9826c","Type":"ContainerDied","Data":"21065341d65c55328d33fca19982cb91c451939d0b0dd32c90272cca9aecf888"} Feb 27 01:23:27 crc kubenswrapper[4771]: I0227 01:23:27.544515 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fjbn5"] Feb 27 01:23:27 crc kubenswrapper[4771]: I0227 01:23:27.582214 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 27 01:23:27 crc kubenswrapper[4771]: W0227 01:23:27.605970 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod251e5c6f_c762_4a6e_9253_81f94d592239.slice/crio-9ee1062d83fd0a4ae514acd5e89aa4a5509f42b4b7d28a05ac34523c9bdc0fc6 WatchSource:0}: Error finding container 9ee1062d83fd0a4ae514acd5e89aa4a5509f42b4b7d28a05ac34523c9bdc0fc6: Status 404 returned error can't find the container with id 9ee1062d83fd0a4ae514acd5e89aa4a5509f42b4b7d28a05ac34523c9bdc0fc6 Feb 27 01:23:28 crc 
kubenswrapper[4771]: I0227 01:23:28.031060 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s5lkp" podUID="8c578c69-744e-425b-8bb1-76eec4b332ec" containerName="ovn-controller" probeResult="failure" output=<
Feb 27 01:23:28 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 27 01:23:28 crc kubenswrapper[4771]: >
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.089390 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tjchc"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.120021 4771 generic.go:334] "Generic (PLEG): container finished" podID="9a026cce-a7cd-45e6-b4e0-6081d19b016a" containerID="00fa4b891b821b195565259e42c2ca98d82e3fa932565fb0a3a1e9128225c110" exitCode=0
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.120080 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjbn5" event={"ID":"9a026cce-a7cd-45e6-b4e0-6081d19b016a","Type":"ContainerDied","Data":"00fa4b891b821b195565259e42c2ca98d82e3fa932565fb0a3a1e9128225c110"}
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.120106 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjbn5" event={"ID":"9a026cce-a7cd-45e6-b4e0-6081d19b016a","Type":"ContainerStarted","Data":"29f5fca2e9a02c0bd6df25e3e9ff80ed42f506d45e3b79eb094d86be200003b3"}
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.122122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a2c84581-5806-46dd-b352-390ef2d9826c","Type":"ContainerStarted","Data":"15d2961cadf42189b71af6f1511da1e669276312eea977d283256c35c14a13bf"}
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.122754 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.124530 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a3aec8d2-008a-4b77-a30b-23f8e812e332","Type":"ContainerStarted","Data":"67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f"}
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.124826 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.125942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"9ee1062d83fd0a4ae514acd5e89aa4a5509f42b4b7d28a05ac34523c9bdc0fc6"}
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.127113 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xxhc8" event={"ID":"1e019cf0-706d-444b-98cb-b07123d2a0d1","Type":"ContainerStarted","Data":"e0e45977b813e294d17ab1b1dd234d6ea49582deb369001e7d9b65560cfd6936"}
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.153596 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xxhc8" podStartSLOduration=2.864891742 podStartE2EDuration="16.153579881s" podCreationTimestamp="2026-02-27 01:23:12 +0000 UTC" firstStartedPulling="2026-02-27 01:23:13.759665238 +0000 UTC m=+1106.697226566" lastFinishedPulling="2026-02-27 01:23:27.048353417 +0000 UTC m=+1119.985914705" observedRunningTime="2026-02-27 01:23:28.152625595 +0000 UTC m=+1121.090186883" watchObservedRunningTime="2026-02-27 01:23:28.153579881 +0000 UTC m=+1121.091141169"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.186395 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.549465017 podStartE2EDuration="1m0.186372893s" podCreationTimestamp="2026-02-27 01:22:28 +0000 UTC" firstStartedPulling="2026-02-27 01:22:43.030641661 +0000 UTC m=+1075.968202949" lastFinishedPulling="2026-02-27 01:22:50.667549537 +0000 UTC m=+1083.605110825" observedRunningTime="2026-02-27 01:23:28.171940031 +0000 UTC m=+1121.109501319" watchObservedRunningTime="2026-02-27 01:23:28.186372893 +0000 UTC m=+1121.123934181"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.207926 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.640353549 podStartE2EDuration="1m0.20790609s" podCreationTimestamp="2026-02-27 01:22:28 +0000 UTC" firstStartedPulling="2026-02-27 01:22:43.099703148 +0000 UTC m=+1076.037264436" lastFinishedPulling="2026-02-27 01:22:50.667255689 +0000 UTC m=+1083.604816977" observedRunningTime="2026-02-27 01:23:28.202031919 +0000 UTC m=+1121.139593227" watchObservedRunningTime="2026-02-27 01:23:28.20790609 +0000 UTC m=+1121.145467388"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.321948 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s5lkp-config-stlxl"]
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.323273 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.326733 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.330984 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5lkp-config-stlxl"]
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.494378 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-additional-scripts\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.494755 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-log-ovn\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.494854 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-scripts\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.494934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run-ovn\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.494952 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k74tq\" (UniqueName: \"kubernetes.io/projected/ff84619c-6bf4-4704-ae56-0ea7d676aad9-kube-api-access-k74tq\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.494978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.596374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run-ovn\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.596872 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k74tq\" (UniqueName: \"kubernetes.io/projected/ff84619c-6bf4-4704-ae56-0ea7d676aad9-kube-api-access-k74tq\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.596812 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run-ovn\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.596970 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.597039 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-additional-scripts\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.597109 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.597233 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-log-ovn\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.597324 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-log-ovn\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.597467 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-scripts\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.598213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-additional-scripts\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.599862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-scripts\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.615303 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k74tq\" (UniqueName: \"kubernetes.io/projected/ff84619c-6bf4-4704-ae56-0ea7d676aad9-kube-api-access-k74tq\") pod \"ovn-controller-s5lkp-config-stlxl\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") " pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.653250 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.953315 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:23:28 crc kubenswrapper[4771]: I0227 01:23:28.953668 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:23:29 crc kubenswrapper[4771]: W0227 01:23:29.097004 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff84619c_6bf4_4704_ae56_0ea7d676aad9.slice/crio-83f860b8f9b8fe458ae2471dfb604c2c6415711dbff94bf6850253255172651a WatchSource:0}: Error finding container 83f860b8f9b8fe458ae2471dfb604c2c6415711dbff94bf6850253255172651a: Status 404 returned error can't find the container with id 83f860b8f9b8fe458ae2471dfb604c2c6415711dbff94bf6850253255172651a
Feb 27 01:23:29 crc kubenswrapper[4771]: I0227 01:23:29.093203 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5lkp-config-stlxl"]
Feb 27 01:23:29 crc kubenswrapper[4771]: I0227 01:23:29.147760 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"13025e80676729557392f9f3dcbf1b1bf1b3946ca8ae1be623ef27b4ca3020ba"}
Feb 27 01:23:29 crc kubenswrapper[4771]: I0227 01:23:29.150201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5lkp-config-stlxl" event={"ID":"ff84619c-6bf4-4704-ae56-0ea7d676aad9","Type":"ContainerStarted","Data":"83f860b8f9b8fe458ae2471dfb604c2c6415711dbff94bf6850253255172651a"}
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:29.528795 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fjbn5"
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:29.646045 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg75x\" (UniqueName: \"kubernetes.io/projected/9a026cce-a7cd-45e6-b4e0-6081d19b016a-kube-api-access-jg75x\") pod \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\" (UID: \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\") "
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:29.646244 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a026cce-a7cd-45e6-b4e0-6081d19b016a-operator-scripts\") pod \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\" (UID: \"9a026cce-a7cd-45e6-b4e0-6081d19b016a\") "
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:29.647130 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a026cce-a7cd-45e6-b4e0-6081d19b016a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a026cce-a7cd-45e6-b4e0-6081d19b016a" (UID: "9a026cce-a7cd-45e6-b4e0-6081d19b016a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:29.650303 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a026cce-a7cd-45e6-b4e0-6081d19b016a-kube-api-access-jg75x" (OuterVolumeSpecName: "kube-api-access-jg75x") pod "9a026cce-a7cd-45e6-b4e0-6081d19b016a" (UID: "9a026cce-a7cd-45e6-b4e0-6081d19b016a"). InnerVolumeSpecName "kube-api-access-jg75x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:29.747517 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg75x\" (UniqueName: \"kubernetes.io/projected/9a026cce-a7cd-45e6-b4e0-6081d19b016a-kube-api-access-jg75x\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:29.747922 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a026cce-a7cd-45e6-b4e0-6081d19b016a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:30.161294 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"37e60fe14daebab4176f6c1270cc2c784107d4962502286e25b1cede43483507"}
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:30.161346 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"f3e5183106ceae901d5f532620ce3cf94b9babaedda9be308d5068184be175d6"}
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:30.161365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"cc3e660a70944916ae062bad0a5ef56154ede6b800472801fa6da8705f3d0c08"}
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:30.164023 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjbn5" event={"ID":"9a026cce-a7cd-45e6-b4e0-6081d19b016a","Type":"ContainerDied","Data":"29f5fca2e9a02c0bd6df25e3e9ff80ed42f506d45e3b79eb094d86be200003b3"}
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:30.164048 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f5fca2e9a02c0bd6df25e3e9ff80ed42f506d45e3b79eb094d86be200003b3"
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:30.164110 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fjbn5"
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:30.166377 4771 generic.go:334] "Generic (PLEG): container finished" podID="ff84619c-6bf4-4704-ae56-0ea7d676aad9" containerID="756ffee0e7508ceca49f673f36c03777cf26b5f9a2f5ebf61d40abff77ef74b3" exitCode=0
Feb 27 01:23:30 crc kubenswrapper[4771]: I0227 01:23:30.166403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5lkp-config-stlxl" event={"ID":"ff84619c-6bf4-4704-ae56-0ea7d676aad9","Type":"ContainerDied","Data":"756ffee0e7508ceca49f673f36c03777cf26b5f9a2f5ebf61d40abff77ef74b3"}
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.009028 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fjbn5"]
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.016889 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fjbn5"]
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.182197 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"643e2345735270d6ed148bad356186ad00e74d9279ad14fe53ef32adac7c50d9"}
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.537379 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679186 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-scripts\") pod \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") "
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679629 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-additional-scripts\") pod \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") "
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679672 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run\") pod \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") "
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679730 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run-ovn\") pod \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") "
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679750 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k74tq\" (UniqueName: \"kubernetes.io/projected/ff84619c-6bf4-4704-ae56-0ea7d676aad9-kube-api-access-k74tq\") pod \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") "
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679776 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-log-ovn\") pod \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\" (UID: \"ff84619c-6bf4-4704-ae56-0ea7d676aad9\") "
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679812 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ff84619c-6bf4-4704-ae56-0ea7d676aad9" (UID: "ff84619c-6bf4-4704-ae56-0ea7d676aad9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679880 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run" (OuterVolumeSpecName: "var-run") pod "ff84619c-6bf4-4704-ae56-0ea7d676aad9" (UID: "ff84619c-6bf4-4704-ae56-0ea7d676aad9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.679967 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ff84619c-6bf4-4704-ae56-0ea7d676aad9" (UID: "ff84619c-6bf4-4704-ae56-0ea7d676aad9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.680172 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ff84619c-6bf4-4704-ae56-0ea7d676aad9" (UID: "ff84619c-6bf4-4704-ae56-0ea7d676aad9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.680317 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-scripts" (OuterVolumeSpecName: "scripts") pod "ff84619c-6bf4-4704-ae56-0ea7d676aad9" (UID: "ff84619c-6bf4-4704-ae56-0ea7d676aad9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.680583 4771 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.680597 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.680605 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.680614 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff84619c-6bf4-4704-ae56-0ea7d676aad9-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.680624 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff84619c-6bf4-4704-ae56-0ea7d676aad9-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.685776 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff84619c-6bf4-4704-ae56-0ea7d676aad9-kube-api-access-k74tq" (OuterVolumeSpecName: "kube-api-access-k74tq") pod "ff84619c-6bf4-4704-ae56-0ea7d676aad9" (UID: "ff84619c-6bf4-4704-ae56-0ea7d676aad9"). InnerVolumeSpecName "kube-api-access-k74tq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.781887 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k74tq\" (UniqueName: \"kubernetes.io/projected/ff84619c-6bf4-4704-ae56-0ea7d676aad9-kube-api-access-k74tq\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:31 crc kubenswrapper[4771]: I0227 01:23:31.783977 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a026cce-a7cd-45e6-b4e0-6081d19b016a" path="/var/lib/kubelet/pods/9a026cce-a7cd-45e6-b4e0-6081d19b016a/volumes"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.195209 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"3accda68bc11aedc14891a201bf7ca4e29c60ef804bb55948c93123618a5c014"}
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.197309 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5lkp-config-stlxl" event={"ID":"ff84619c-6bf4-4704-ae56-0ea7d676aad9","Type":"ContainerDied","Data":"83f860b8f9b8fe458ae2471dfb604c2c6415711dbff94bf6850253255172651a"}
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.197340 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83f860b8f9b8fe458ae2471dfb604c2c6415711dbff94bf6850253255172651a"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.197517 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp-config-stlxl"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.643718 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s5lkp-config-stlxl"]
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.648955 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s5lkp-config-stlxl"]
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.768451 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s5lkp-config-zx6vm"]
Feb 27 01:23:32 crc kubenswrapper[4771]: E0227 01:23:32.768877 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff84619c-6bf4-4704-ae56-0ea7d676aad9" containerName="ovn-config"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.768901 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff84619c-6bf4-4704-ae56-0ea7d676aad9" containerName="ovn-config"
Feb 27 01:23:32 crc kubenswrapper[4771]: E0227 01:23:32.768920 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a026cce-a7cd-45e6-b4e0-6081d19b016a" containerName="mariadb-account-create-update"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.768929 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a026cce-a7cd-45e6-b4e0-6081d19b016a" containerName="mariadb-account-create-update"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.769152 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a026cce-a7cd-45e6-b4e0-6081d19b016a" containerName="mariadb-account-create-update"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.769178 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff84619c-6bf4-4704-ae56-0ea7d676aad9" containerName="ovn-config"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.769835 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.771806 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.786901 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5lkp-config-zx6vm"]
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.906475 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bwzh\" (UniqueName: \"kubernetes.io/projected/4bf90520-b383-4270-99ee-ef5a80d5fb78-kube-api-access-7bwzh\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.906519 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run-ovn\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.906577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-scripts\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.906619 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-log-ovn\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.906676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-additional-scripts\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:32 crc kubenswrapper[4771]: I0227 01:23:32.906719 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.008608 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-log-ovn\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.008859 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-additional-scripts\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.008972 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-log-ovn\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.009087 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.009159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.009311 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bwzh\" (UniqueName: \"kubernetes.io/projected/4bf90520-b383-4270-99ee-ef5a80d5fb78-kube-api-access-7bwzh\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.009416 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run-ovn\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.009546 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-scripts\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.009562 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run-ovn\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.009723 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-additional-scripts\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.011400 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-scripts\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.041449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bwzh\" (UniqueName: \"kubernetes.io/projected/4bf90520-b383-4270-99ee-ef5a80d5fb78-kube-api-access-7bwzh\") pod \"ovn-controller-s5lkp-config-zx6vm\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") " pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.050820 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s5lkp"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.090308 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:33 crc kubenswrapper[4771]: I0227 01:23:33.786960 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff84619c-6bf4-4704-ae56-0ea7d676aad9" path="/var/lib/kubelet/pods/ff84619c-6bf4-4704-ae56-0ea7d676aad9/volumes"
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.674108 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9528l"]
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.680734 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.689355 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9528l"]
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.690461 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.837761 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d958eb-cffb-4b89-bd08-7a15d68297e6-operator-scripts\") pod \"root-account-create-update-9528l\" (UID: \"16d958eb-cffb-4b89-bd08-7a15d68297e6\") " pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.838192 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2xpd\" (UniqueName: \"kubernetes.io/projected/16d958eb-cffb-4b89-bd08-7a15d68297e6-kube-api-access-t2xpd\") pod \"root-account-create-update-9528l\" (UID: \"16d958eb-cffb-4b89-bd08-7a15d68297e6\") " pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.940027 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d958eb-cffb-4b89-bd08-7a15d68297e6-operator-scripts\") pod \"root-account-create-update-9528l\" (UID: \"16d958eb-cffb-4b89-bd08-7a15d68297e6\") " pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.940277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xpd\" (UniqueName: \"kubernetes.io/projected/16d958eb-cffb-4b89-bd08-7a15d68297e6-kube-api-access-t2xpd\") pod \"root-account-create-update-9528l\" (UID: \"16d958eb-cffb-4b89-bd08-7a15d68297e6\") " pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.946617 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d958eb-cffb-4b89-bd08-7a15d68297e6-operator-scripts\") pod \"root-account-create-update-9528l\" (UID: \"16d958eb-cffb-4b89-bd08-7a15d68297e6\") " pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:34 crc kubenswrapper[4771]: I0227 01:23:34.968640 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xpd\" (UniqueName: \"kubernetes.io/projected/16d958eb-cffb-4b89-bd08-7a15d68297e6-kube-api-access-t2xpd\") pod \"root-account-create-update-9528l\" (UID: \"16d958eb-cffb-4b89-bd08-7a15d68297e6\") " pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:35 crc kubenswrapper[4771]: I0227 01:23:35.005515 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:35 crc kubenswrapper[4771]: I0227 01:23:35.222483 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"1981134a5a98fa6bef33cf5776108066ede81842575913e251352cbed3a3a7c2"}
Feb 27 01:23:35 crc kubenswrapper[4771]: I0227 01:23:35.287218 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9528l"]
Feb 27 01:23:35 crc kubenswrapper[4771]: W0227 01:23:35.292373 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16d958eb_cffb_4b89_bd08_7a15d68297e6.slice/crio-a8c543eab4d09d7500ed2af035ec8269e5f0b1a1f723694db0f82837e7829660 WatchSource:0}: Error finding container a8c543eab4d09d7500ed2af035ec8269e5f0b1a1f723694db0f82837e7829660: Status 404 returned error can't find the container with id a8c543eab4d09d7500ed2af035ec8269e5f0b1a1f723694db0f82837e7829660
Feb 27 01:23:35 crc kubenswrapper[4771]: I0227 01:23:35.406426 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s5lkp-config-zx6vm"]
Feb 27 01:23:35 crc kubenswrapper[4771]: W0227 01:23:35.407805 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bf90520_b383_4270_99ee_ef5a80d5fb78.slice/crio-f17367e644b5023868526443971b0a2d3a19d45b330588329e129c2ed4e76bd0 WatchSource:0}: Error finding container f17367e644b5023868526443971b0a2d3a19d45b330588329e129c2ed4e76bd0: Status 404 returned error can't find the container with id f17367e644b5023868526443971b0a2d3a19d45b330588329e129c2ed4e76bd0
Feb 27 01:23:36 crc kubenswrapper[4771]: I0227 01:23:36.234962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"fcf25bdcda6fc2019a585af54351f526a7771a20f3dda618fc97606852b99795"}
Feb 27 01:23:36 crc kubenswrapper[4771]: I0227 01:23:36.237527 4771 generic.go:334] "Generic (PLEG): container finished" podID="16d958eb-cffb-4b89-bd08-7a15d68297e6" containerID="1e50491ccf3ca68206e4ef11def086bfb6d3b305b3a567410e82b4fc8382b8f3" exitCode=0
Feb 27 01:23:36 crc kubenswrapper[4771]: I0227 01:23:36.237602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9528l" event={"ID":"16d958eb-cffb-4b89-bd08-7a15d68297e6","Type":"ContainerDied","Data":"1e50491ccf3ca68206e4ef11def086bfb6d3b305b3a567410e82b4fc8382b8f3"}
Feb 27 01:23:36 crc kubenswrapper[4771]: I0227 01:23:36.237701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9528l" event={"ID":"16d958eb-cffb-4b89-bd08-7a15d68297e6","Type":"ContainerStarted","Data":"a8c543eab4d09d7500ed2af035ec8269e5f0b1a1f723694db0f82837e7829660"}
Feb 27 01:23:36 crc kubenswrapper[4771]: I0227 01:23:36.239592 4771 generic.go:334] "Generic (PLEG): container finished" podID="4bf90520-b383-4270-99ee-ef5a80d5fb78" containerID="4e1b659b45a1a370910ba8313405db513e67bf9f0b1146bc6f3fa60e4de0a02d" exitCode=0
Feb 27 01:23:36 crc kubenswrapper[4771]: I0227 01:23:36.239627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5lkp-config-zx6vm" event={"ID":"4bf90520-b383-4270-99ee-ef5a80d5fb78","Type":"ContainerDied","Data":"4e1b659b45a1a370910ba8313405db513e67bf9f0b1146bc6f3fa60e4de0a02d"}
Feb 27 01:23:36 crc kubenswrapper[4771]: I0227 01:23:36.239647 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5lkp-config-zx6vm" event={"ID":"4bf90520-b383-4270-99ee-ef5a80d5fb78","Type":"ContainerStarted","Data":"f17367e644b5023868526443971b0a2d3a19d45b330588329e129c2ed4e76bd0"}
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.252473 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"14314db5cebe3cc1f4b3c7b5538e0ca1499cecdd7bd45226663872a27b8057f1"}
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.252868 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"c9a8984c1c2e962263ade6752d52b164c782001dd2440d1aa785d43612550168"}
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.252887 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"cdf32ca7bd041fbe3915f6ab65a3f8d58d8bc6aa6e444af84e084e22ae1180a9"}
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.252900 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"dc8df1eac543fcb5ffce50320f825aa72b1f0b6016243fd1ee23dacf57fb1e03"}
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.254085 4771 generic.go:334] "Generic (PLEG): container finished" podID="1e019cf0-706d-444b-98cb-b07123d2a0d1" containerID="e0e45977b813e294d17ab1b1dd234d6ea49582deb369001e7d9b65560cfd6936" exitCode=0
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.254169 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xxhc8" event={"ID":"1e019cf0-706d-444b-98cb-b07123d2a0d1","Type":"ContainerDied","Data":"e0e45977b813e294d17ab1b1dd234d6ea49582deb369001e7d9b65560cfd6936"}
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.637034 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.638786 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.789919 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run-ovn\") pod \"4bf90520-b383-4270-99ee-ef5a80d5fb78\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") "
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790020 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d958eb-cffb-4b89-bd08-7a15d68297e6-operator-scripts\") pod \"16d958eb-cffb-4b89-bd08-7a15d68297e6\" (UID: \"16d958eb-cffb-4b89-bd08-7a15d68297e6\") "
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790030 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4bf90520-b383-4270-99ee-ef5a80d5fb78" (UID: "4bf90520-b383-4270-99ee-ef5a80d5fb78"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790086 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-scripts\") pod \"4bf90520-b383-4270-99ee-ef5a80d5fb78\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") "
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790111 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-additional-scripts\") pod \"4bf90520-b383-4270-99ee-ef5a80d5fb78\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") "
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790126 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-log-ovn\") pod \"4bf90520-b383-4270-99ee-ef5a80d5fb78\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") "
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790194 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2xpd\" (UniqueName: \"kubernetes.io/projected/16d958eb-cffb-4b89-bd08-7a15d68297e6-kube-api-access-t2xpd\") pod \"16d958eb-cffb-4b89-bd08-7a15d68297e6\" (UID: \"16d958eb-cffb-4b89-bd08-7a15d68297e6\") "
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790220 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run\") pod \"4bf90520-b383-4270-99ee-ef5a80d5fb78\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") "
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790259 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bwzh\" (UniqueName: \"kubernetes.io/projected/4bf90520-b383-4270-99ee-ef5a80d5fb78-kube-api-access-7bwzh\") pod \"4bf90520-b383-4270-99ee-ef5a80d5fb78\" (UID: \"4bf90520-b383-4270-99ee-ef5a80d5fb78\") "
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790590 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790769 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d958eb-cffb-4b89-bd08-7a15d68297e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16d958eb-cffb-4b89-bd08-7a15d68297e6" (UID: "16d958eb-cffb-4b89-bd08-7a15d68297e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.790808 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run" (OuterVolumeSpecName: "var-run") pod "4bf90520-b383-4270-99ee-ef5a80d5fb78" (UID: "4bf90520-b383-4270-99ee-ef5a80d5fb78"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.791530 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4bf90520-b383-4270-99ee-ef5a80d5fb78" (UID: "4bf90520-b383-4270-99ee-ef5a80d5fb78"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.793116 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-scripts" (OuterVolumeSpecName: "scripts") pod "4bf90520-b383-4270-99ee-ef5a80d5fb78" (UID: "4bf90520-b383-4270-99ee-ef5a80d5fb78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.793265 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4bf90520-b383-4270-99ee-ef5a80d5fb78" (UID: "4bf90520-b383-4270-99ee-ef5a80d5fb78"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.794728 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bf90520-b383-4270-99ee-ef5a80d5fb78-kube-api-access-7bwzh" (OuterVolumeSpecName: "kube-api-access-7bwzh") pod "4bf90520-b383-4270-99ee-ef5a80d5fb78" (UID: "4bf90520-b383-4270-99ee-ef5a80d5fb78"). InnerVolumeSpecName "kube-api-access-7bwzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.795907 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d958eb-cffb-4b89-bd08-7a15d68297e6-kube-api-access-t2xpd" (OuterVolumeSpecName: "kube-api-access-t2xpd") pod "16d958eb-cffb-4b89-bd08-7a15d68297e6" (UID: "16d958eb-cffb-4b89-bd08-7a15d68297e6"). InnerVolumeSpecName "kube-api-access-t2xpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.892182 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d958eb-cffb-4b89-bd08-7a15d68297e6-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.892394 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.892477 4771 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4bf90520-b383-4270-99ee-ef5a80d5fb78-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.892541 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.892672 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2xpd\" (UniqueName: \"kubernetes.io/projected/16d958eb-cffb-4b89-bd08-7a15d68297e6-kube-api-access-t2xpd\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.892754 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4bf90520-b383-4270-99ee-ef5a80d5fb78-var-run\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:37 crc kubenswrapper[4771]: I0227 01:23:37.892818 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bwzh\" (UniqueName: \"kubernetes.io/projected/4bf90520-b383-4270-99ee-ef5a80d5fb78-kube-api-access-7bwzh\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.265522 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9528l"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.265785 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9528l" event={"ID":"16d958eb-cffb-4b89-bd08-7a15d68297e6","Type":"ContainerDied","Data":"a8c543eab4d09d7500ed2af035ec8269e5f0b1a1f723694db0f82837e7829660"}
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.266665 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8c543eab4d09d7500ed2af035ec8269e5f0b1a1f723694db0f82837e7829660"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.269627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s5lkp-config-zx6vm" event={"ID":"4bf90520-b383-4270-99ee-ef5a80d5fb78","Type":"ContainerDied","Data":"f17367e644b5023868526443971b0a2d3a19d45b330588329e129c2ed4e76bd0"}
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.269666 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f17367e644b5023868526443971b0a2d3a19d45b330588329e129c2ed4e76bd0"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.269734 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s5lkp-config-zx6vm"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.294143 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"d60da3d06aed7c88c439e787de4a416a9eb96b881e1f9614fe3162ad3891b695"}
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.294227 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"df5da1f6d3d50a4b195bf0dc1acf06c61b036cf4de30086c6ec2d3ba07731507"}
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.294248 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"251e5c6f-c762-4a6e-9253-81f94d592239","Type":"ContainerStarted","Data":"86ff0e594224c0e4afe6ad1821eff0fc932d7bc173fc3a70f558b4a6a78855b8"}
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.342629 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=25.551805388 podStartE2EDuration="34.342610207s" podCreationTimestamp="2026-02-27 01:23:04 +0000 UTC" firstStartedPulling="2026-02-27 01:23:27.608919371 +0000 UTC m=+1120.546480659" lastFinishedPulling="2026-02-27 01:23:36.39972419 +0000 UTC m=+1129.337285478" observedRunningTime="2026-02-27 01:23:38.339369609 +0000 UTC m=+1131.276930927" watchObservedRunningTime="2026-02-27 01:23:38.342610207 +0000 UTC m=+1131.280171495"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.644464 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5lnp"]
Feb 27 01:23:38 crc kubenswrapper[4771]: E0227 01:23:38.644831 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bf90520-b383-4270-99ee-ef5a80d5fb78" containerName="ovn-config"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.644848 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf90520-b383-4270-99ee-ef5a80d5fb78" containerName="ovn-config"
Feb 27 01:23:38 crc kubenswrapper[4771]: E0227 01:23:38.644871 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d958eb-cffb-4b89-bd08-7a15d68297e6" containerName="mariadb-account-create-update"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.644879 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d958eb-cffb-4b89-bd08-7a15d68297e6" containerName="mariadb-account-create-update"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.645028 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bf90520-b383-4270-99ee-ef5a80d5fb78" containerName="ovn-config"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.645044 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d958eb-cffb-4b89-bd08-7a15d68297e6" containerName="mariadb-account-create-update"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.648537 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.651406 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5lnp"]
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.653373 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.717945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.717985 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.718025 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.718044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-config\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.718127 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96kf\" (UniqueName: \"kubernetes.io/projected/c21e258c-5496-49af-a0f2-9515eea67a47-kube-api-access-f96kf\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.718149 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.753734 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s5lkp-config-zx6vm"]
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.763096 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xxhc8"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.765590 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s5lkp-config-zx6vm"]
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.820133 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f96kf\" (UniqueName: \"kubernetes.io/projected/c21e258c-5496-49af-a0f2-9515eea67a47-kube-api-access-f96kf\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.820185 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.820225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.820241 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.820268 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.820286 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-config\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.821397 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-config\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.821493 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.822104 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.822245 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.822635 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.843776 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96kf\" (UniqueName: \"kubernetes.io/projected/c21e258c-5496-49af-a0f2-9515eea67a47-kube-api-access-f96kf\") pod \"dnsmasq-dns-764c5664d7-v5lnp\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.921091 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-combined-ca-bundle\") pod \"1e019cf0-706d-444b-98cb-b07123d2a0d1\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") "
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.921314 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-config-data\") pod \"1e019cf0-706d-444b-98cb-b07123d2a0d1\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") "
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.922004 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9wfk\" (UniqueName: \"kubernetes.io/projected/1e019cf0-706d-444b-98cb-b07123d2a0d1-kube-api-access-w9wfk\") pod \"1e019cf0-706d-444b-98cb-b07123d2a0d1\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") "
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.922259 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-db-sync-config-data\") pod \"1e019cf0-706d-444b-98cb-b07123d2a0d1\" (UID: \"1e019cf0-706d-444b-98cb-b07123d2a0d1\") "
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.925415 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1e019cf0-706d-444b-98cb-b07123d2a0d1" (UID: "1e019cf0-706d-444b-98cb-b07123d2a0d1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.927718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e019cf0-706d-444b-98cb-b07123d2a0d1-kube-api-access-w9wfk" (OuterVolumeSpecName: "kube-api-access-w9wfk") pod "1e019cf0-706d-444b-98cb-b07123d2a0d1" (UID: "1e019cf0-706d-444b-98cb-b07123d2a0d1"). InnerVolumeSpecName "kube-api-access-w9wfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.963279 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v5lnp"
Feb 27 01:23:38 crc kubenswrapper[4771]: I0227 01:23:38.971233 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e019cf0-706d-444b-98cb-b07123d2a0d1" (UID: "1e019cf0-706d-444b-98cb-b07123d2a0d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.024544 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.024642 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9wfk\" (UniqueName: \"kubernetes.io/projected/1e019cf0-706d-444b-98cb-b07123d2a0d1-kube-api-access-w9wfk\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.024662 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.033705 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-config-data" (OuterVolumeSpecName: "config-data") pod "1e019cf0-706d-444b-98cb-b07123d2a0d1" (UID: "1e019cf0-706d-444b-98cb-b07123d2a0d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.125581 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e019cf0-706d-444b-98cb-b07123d2a0d1-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.309843 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-db-sync-xxhc8" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.309925 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xxhc8" event={"ID":"1e019cf0-706d-444b-98cb-b07123d2a0d1","Type":"ContainerDied","Data":"eeae6e297cce627c7bea8b24179c57a1ea7da3a21f8338deb8d4eaa1e5ac88fd"} Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.309953 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeae6e297cce627c7bea8b24179c57a1ea7da3a21f8338deb8d4eaa1e5ac88fd" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.489482 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5lnp"] Feb 27 01:23:39 crc kubenswrapper[4771]: W0227 01:23:39.501998 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc21e258c_5496_49af_a0f2_9515eea67a47.slice/crio-d6c34542f0830220badc67268bd2797f24512bcc458b765e1ba912f4a1c1d4e5 WatchSource:0}: Error finding container d6c34542f0830220badc67268bd2797f24512bcc458b765e1ba912f4a1c1d4e5: Status 404 returned error can't find the container with id d6c34542f0830220badc67268bd2797f24512bcc458b765e1ba912f4a1c1d4e5 Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.636994 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5lnp"] Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.666344 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-xsnfw"] Feb 27 01:23:39 crc kubenswrapper[4771]: E0227 01:23:39.666666 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e019cf0-706d-444b-98cb-b07123d2a0d1" containerName="glance-db-sync" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.666683 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e019cf0-706d-444b-98cb-b07123d2a0d1" containerName="glance-db-sync" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.666862 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e019cf0-706d-444b-98cb-b07123d2a0d1" containerName="glance-db-sync" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.667813 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.685126 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-xsnfw"] Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.736819 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.736886 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.736919 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.737101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-config\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.737220 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zhjb\" (UniqueName: \"kubernetes.io/projected/b1cbef08-6bd3-4010-8d53-914b02a1d670-kube-api-access-2zhjb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.737330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.762700 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.788754 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bf90520-b383-4270-99ee-ef5a80d5fb78" path="/var/lib/kubelet/pods/4bf90520-b383-4270-99ee-ef5a80d5fb78/volumes" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.825778 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.839149 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-swift-storage-0\") pod 
\"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.839197 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.839278 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-config\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.839306 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zhjb\" (UniqueName: \"kubernetes.io/projected/b1cbef08-6bd3-4010-8d53-914b02a1d670-kube-api-access-2zhjb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.839367 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.839427 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.840179 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.840482 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.840605 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.841400 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-config\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.841446 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:39 crc kubenswrapper[4771]: I0227 01:23:39.870299 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zhjb\" (UniqueName: \"kubernetes.io/projected/b1cbef08-6bd3-4010-8d53-914b02a1d670-kube-api-access-2zhjb\") pod \"dnsmasq-dns-74f6bcbc87-xsnfw\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.006508 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.164291 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5lfsc"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.165679 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.175715 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5696-account-create-update-7mx4f"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.177050 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.182886 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.187590 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5lfsc"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.211996 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5696-account-create-update-7mx4f"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.245572 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqv9q\" (UniqueName: \"kubernetes.io/projected/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-kube-api-access-pqv9q\") pod \"cinder-5696-account-create-update-7mx4f\" (UID: \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\") " pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.245631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp4k6\" (UniqueName: \"kubernetes.io/projected/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-kube-api-access-xp4k6\") pod \"cinder-db-create-5lfsc\" (UID: \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\") " pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.245654 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-operator-scripts\") pod \"cinder-5696-account-create-update-7mx4f\" (UID: \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\") " pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.245708 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-operator-scripts\") pod \"cinder-db-create-5lfsc\" (UID: \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\") " pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.317829 4771 generic.go:334] "Generic (PLEG): container finished" podID="c21e258c-5496-49af-a0f2-9515eea67a47" containerID="38419f341b794bbcd198bfb12317a21b838eeaaa48420f096e07d1f824d71abc" exitCode=0 Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.317869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v5lnp" event={"ID":"c21e258c-5496-49af-a0f2-9515eea67a47","Type":"ContainerDied","Data":"38419f341b794bbcd198bfb12317a21b838eeaaa48420f096e07d1f824d71abc"} Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.317894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v5lnp" event={"ID":"c21e258c-5496-49af-a0f2-9515eea67a47","Type":"ContainerStarted","Data":"d6c34542f0830220badc67268bd2797f24512bcc458b765e1ba912f4a1c1d4e5"} Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.353581 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqv9q\" (UniqueName: \"kubernetes.io/projected/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-kube-api-access-pqv9q\") pod \"cinder-5696-account-create-update-7mx4f\" (UID: \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\") " pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.353629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp4k6\" (UniqueName: \"kubernetes.io/projected/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-kube-api-access-xp4k6\") pod \"cinder-db-create-5lfsc\" (UID: \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\") " pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.353655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-operator-scripts\") pod \"cinder-5696-account-create-update-7mx4f\" (UID: \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\") " pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.353711 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-operator-scripts\") pod \"cinder-db-create-5lfsc\" (UID: \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\") " pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.354384 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-operator-scripts\") pod \"cinder-db-create-5lfsc\" (UID: \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\") " pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.355141 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-operator-scripts\") pod \"cinder-5696-account-create-update-7mx4f\" (UID: \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\") " 
pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.368381 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cg8jm"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.369471 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.397174 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqv9q\" (UniqueName: \"kubernetes.io/projected/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-kube-api-access-pqv9q\") pod \"cinder-5696-account-create-update-7mx4f\" (UID: \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\") " pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.418474 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp4k6\" (UniqueName: \"kubernetes.io/projected/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-kube-api-access-xp4k6\") pod \"cinder-db-create-5lfsc\" (UID: \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\") " pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.424814 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cg8jm"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.433290 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-201a-account-create-update-b8qmr"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.434300 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.439486 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.450069 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-201a-account-create-update-b8qmr"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.455296 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d47d411-ba32-47c5-96dc-448ab3aab865-operator-scripts\") pod \"barbican-201a-account-create-update-b8qmr\" (UID: \"9d47d411-ba32-47c5-96dc-448ab3aab865\") " pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.455730 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjsst\" (UniqueName: \"kubernetes.io/projected/634488bc-0dd8-4dbc-8be4-99328c6a0088-kube-api-access-sjsst\") pod \"barbican-db-create-cg8jm\" (UID: \"634488bc-0dd8-4dbc-8be4-99328c6a0088\") " pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.455817 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/634488bc-0dd8-4dbc-8be4-99328c6a0088-operator-scripts\") pod \"barbican-db-create-cg8jm\" (UID: \"634488bc-0dd8-4dbc-8be4-99328c6a0088\") " pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.455910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbtml\" (UniqueName: 
\"kubernetes.io/projected/9d47d411-ba32-47c5-96dc-448ab3aab865-kube-api-access-pbtml\") pod \"barbican-201a-account-create-update-b8qmr\" (UID: \"9d47d411-ba32-47c5-96dc-448ab3aab865\") " pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.496239 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-sswqd"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.497965 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.501943 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.519007 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.551855 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sswqd"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.560332 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjsst\" (UniqueName: \"kubernetes.io/projected/634488bc-0dd8-4dbc-8be4-99328c6a0088-kube-api-access-sjsst\") pod \"barbican-db-create-cg8jm\" (UID: \"634488bc-0dd8-4dbc-8be4-99328c6a0088\") " pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.560650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/634488bc-0dd8-4dbc-8be4-99328c6a0088-operator-scripts\") pod \"barbican-db-create-cg8jm\" (UID: \"634488bc-0dd8-4dbc-8be4-99328c6a0088\") " pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.560716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crvcc\" (UniqueName: \"kubernetes.io/projected/b9b12792-b88f-4d69-8df2-03dec12a53ac-kube-api-access-crvcc\") pod \"neutron-db-create-sswqd\" (UID: \"b9b12792-b88f-4d69-8df2-03dec12a53ac\") " pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.560746 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbtml\" (UniqueName: \"kubernetes.io/projected/9d47d411-ba32-47c5-96dc-448ab3aab865-kube-api-access-pbtml\") pod \"barbican-201a-account-create-update-b8qmr\" (UID: \"9d47d411-ba32-47c5-96dc-448ab3aab865\") " pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.560787 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b12792-b88f-4d69-8df2-03dec12a53ac-operator-scripts\") pod \"neutron-db-create-sswqd\" (UID: \"b9b12792-b88f-4d69-8df2-03dec12a53ac\") " pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.560826 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d47d411-ba32-47c5-96dc-448ab3aab865-operator-scripts\") pod \"barbican-201a-account-create-update-b8qmr\" (UID: \"9d47d411-ba32-47c5-96dc-448ab3aab865\") " pod="openstack/barbican-201a-account-create-update-b8qmr" 
Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.561538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d47d411-ba32-47c5-96dc-448ab3aab865-operator-scripts\") pod \"barbican-201a-account-create-update-b8qmr\" (UID: \"9d47d411-ba32-47c5-96dc-448ab3aab865\") " pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.571094 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/634488bc-0dd8-4dbc-8be4-99328c6a0088-operator-scripts\") pod \"barbican-db-create-cg8jm\" (UID: \"634488bc-0dd8-4dbc-8be4-99328c6a0088\") " pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.612762 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjsst\" (UniqueName: \"kubernetes.io/projected/634488bc-0dd8-4dbc-8be4-99328c6a0088-kube-api-access-sjsst\") pod \"barbican-db-create-cg8jm\" (UID: \"634488bc-0dd8-4dbc-8be4-99328c6a0088\") " pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.617601 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbtml\" (UniqueName: \"kubernetes.io/projected/9d47d411-ba32-47c5-96dc-448ab3aab865-kube-api-access-pbtml\") pod \"barbican-201a-account-create-update-b8qmr\" (UID: \"9d47d411-ba32-47c5-96dc-448ab3aab865\") " pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:40 crc kubenswrapper[4771]: W0227 01:23:40.650603 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1cbef08_6bd3_4010_8d53_914b02a1d670.slice/crio-55cadb4e1601e7eec6949f561b2fddf69faa521c797b9684a117ddc636666383 WatchSource:0}: Error finding container 55cadb4e1601e7eec6949f561b2fddf69faa521c797b9684a117ddc636666383: Status 404 returned error can't find the container with id 55cadb4e1601e7eec6949f561b2fddf69faa521c797b9684a117ddc636666383 Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.653383 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-l4w7n"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.654635 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.658124 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7nd8k" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.658330 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.658728 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.660623 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-xsnfw"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.662427 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crvcc\" (UniqueName: \"kubernetes.io/projected/b9b12792-b88f-4d69-8df2-03dec12a53ac-kube-api-access-crvcc\") pod \"neutron-db-create-sswqd\" (UID: \"b9b12792-b88f-4d69-8df2-03dec12a53ac\") " pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.662493 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b12792-b88f-4d69-8df2-03dec12a53ac-operator-scripts\") pod \"neutron-db-create-sswqd\" (UID: \"b9b12792-b88f-4d69-8df2-03dec12a53ac\") " pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.664346 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.665268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b12792-b88f-4d69-8df2-03dec12a53ac-operator-scripts\") pod \"neutron-db-create-sswqd\" (UID: \"b9b12792-b88f-4d69-8df2-03dec12a53ac\") " pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.669270 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-l4w7n"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.675533 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b38e-account-create-update-d286j"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.676656 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.681948 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.685709 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crvcc\" (UniqueName: \"kubernetes.io/projected/b9b12792-b88f-4d69-8df2-03dec12a53ac-kube-api-access-crvcc\") pod \"neutron-db-create-sswqd\" (UID: \"b9b12792-b88f-4d69-8df2-03dec12a53ac\") " pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.685774 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b38e-account-create-update-d286j"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.726405 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.771895 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjh2n\" (UniqueName: \"kubernetes.io/projected/091872dd-fe0b-4e93-a837-2fd692af8f21-kube-api-access-hjh2n\") pod \"neutron-b38e-account-create-update-d286j\" (UID: \"091872dd-fe0b-4e93-a837-2fd692af8f21\") " pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.772049 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-config-data\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.772074 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/091872dd-fe0b-4e93-a837-2fd692af8f21-operator-scripts\") pod \"neutron-b38e-account-create-update-d286j\" (UID: \"091872dd-fe0b-4e93-a837-2fd692af8f21\") " pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.772114 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c269k\" (UniqueName: \"kubernetes.io/projected/cf593882-913d-4168-b14f-c7df95930f73-kube-api-access-c269k\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.772173 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-combined-ca-bundle\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.799879 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.807181 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v5lnp" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.815138 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.873495 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-swift-storage-0\") pod \"c21e258c-5496-49af-a0f2-9515eea67a47\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.875993 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-config\") pod \"c21e258c-5496-49af-a0f2-9515eea67a47\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876021 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-svc\") pod \"c21e258c-5496-49af-a0f2-9515eea67a47\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876115 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f96kf\" (UniqueName: \"kubernetes.io/projected/c21e258c-5496-49af-a0f2-9515eea67a47-kube-api-access-f96kf\") pod \"c21e258c-5496-49af-a0f2-9515eea67a47\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876152 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-sb\") pod \"c21e258c-5496-49af-a0f2-9515eea67a47\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876172 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-nb\") pod \"c21e258c-5496-49af-a0f2-9515eea67a47\" (UID: \"c21e258c-5496-49af-a0f2-9515eea67a47\") " Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876412 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-combined-ca-bundle\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876482 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjh2n\" (UniqueName: \"kubernetes.io/projected/091872dd-fe0b-4e93-a837-2fd692af8f21-kube-api-access-hjh2n\") pod \"neutron-b38e-account-create-update-d286j\" (UID: \"091872dd-fe0b-4e93-a837-2fd692af8f21\") " pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-config-data\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876691 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/091872dd-fe0b-4e93-a837-2fd692af8f21-operator-scripts\") pod \"neutron-b38e-account-create-update-d286j\" (UID: \"091872dd-fe0b-4e93-a837-2fd692af8f21\") " pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.876734 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c269k\" (UniqueName: \"kubernetes.io/projected/cf593882-913d-4168-b14f-c7df95930f73-kube-api-access-c269k\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.890828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/091872dd-fe0b-4e93-a837-2fd692af8f21-operator-scripts\") pod \"neutron-b38e-account-create-update-d286j\" (UID: \"091872dd-fe0b-4e93-a837-2fd692af8f21\") " pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.899497 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-config-data\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.900357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjh2n\" (UniqueName: \"kubernetes.io/projected/091872dd-fe0b-4e93-a837-2fd692af8f21-kube-api-access-hjh2n\") pod \"neutron-b38e-account-create-update-d286j\" (UID: \"091872dd-fe0b-4e93-a837-2fd692af8f21\") " pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.908085 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-combined-ca-bundle\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.914487 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c269k\" (UniqueName: \"kubernetes.io/projected/cf593882-913d-4168-b14f-c7df95930f73-kube-api-access-c269k\") pod \"keystone-db-sync-l4w7n\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.914795 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21e258c-5496-49af-a0f2-9515eea67a47-kube-api-access-f96kf" (OuterVolumeSpecName: "kube-api-access-f96kf") pod "c21e258c-5496-49af-a0f2-9515eea67a47" (UID: "c21e258c-5496-49af-a0f2-9515eea67a47"). InnerVolumeSpecName "kube-api-access-f96kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.925336 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c21e258c-5496-49af-a0f2-9515eea67a47" (UID: "c21e258c-5496-49af-a0f2-9515eea67a47"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.959930 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5696-account-create-update-7mx4f"] Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.969830 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c21e258c-5496-49af-a0f2-9515eea67a47" (UID: "c21e258c-5496-49af-a0f2-9515eea67a47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.993019 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.993052 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f96kf\" (UniqueName: \"kubernetes.io/projected/c21e258c-5496-49af-a0f2-9515eea67a47-kube-api-access-f96kf\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.993063 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:40 crc kubenswrapper[4771]: I0227 01:23:40.997955 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.014284 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.014350 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-config" (OuterVolumeSpecName: "config") pod "c21e258c-5496-49af-a0f2-9515eea67a47" (UID: "c21e258c-5496-49af-a0f2-9515eea67a47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.039228 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c21e258c-5496-49af-a0f2-9515eea67a47" (UID: "c21e258c-5496-49af-a0f2-9515eea67a47"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.040038 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c21e258c-5496-49af-a0f2-9515eea67a47" (UID: "c21e258c-5496-49af-a0f2-9515eea67a47"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.084524 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9528l"] Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.094225 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.094248 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.094258 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21e258c-5496-49af-a0f2-9515eea67a47-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.104524 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9528l"] Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.192254 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5lfsc"] Feb 27 01:23:41 crc kubenswrapper[4771]: W0227 01:23:41.204231 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod733a6478_88f8_4dd2_ad0b_fa824ec14a4d.slice/crio-a62c92429942a753753c3f9317d8f5b9087f3b439dbbdbaf4dbefe8561e8141d WatchSource:0}: Error finding container a62c92429942a753753c3f9317d8f5b9087f3b439dbbdbaf4dbefe8561e8141d: Status 404 returned error can't find the container with id a62c92429942a753753c3f9317d8f5b9087f3b439dbbdbaf4dbefe8561e8141d Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.329518 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5lfsc" event={"ID":"733a6478-88f8-4dd2-ad0b-fa824ec14a4d","Type":"ContainerStarted","Data":"a62c92429942a753753c3f9317d8f5b9087f3b439dbbdbaf4dbefe8561e8141d"} Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.345371 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v5lnp" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.347018 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v5lnp" event={"ID":"c21e258c-5496-49af-a0f2-9515eea67a47","Type":"ContainerDied","Data":"d6c34542f0830220badc67268bd2797f24512bcc458b765e1ba912f4a1c1d4e5"} Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.347080 4771 scope.go:117] "RemoveContainer" containerID="38419f341b794bbcd198bfb12317a21b838eeaaa48420f096e07d1f824d71abc" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.350817 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5696-account-create-update-7mx4f" event={"ID":"46fd6f86-04d4-4c78-bbf8-9f057ac4308b","Type":"ContainerStarted","Data":"031bdd9fd16539743643f8c4773b2ae677e3ecaf0f5d292f99bc62a8db55360c"} Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.353320 4771 generic.go:334] "Generic (PLEG): container finished" podID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerID="8c02a2260da6fe30018e2305175d0b7d93abf4c3b41aa8c92426c5453b31b0fe" exitCode=0 Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.353360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" event={"ID":"b1cbef08-6bd3-4010-8d53-914b02a1d670","Type":"ContainerDied","Data":"8c02a2260da6fe30018e2305175d0b7d93abf4c3b41aa8c92426c5453b31b0fe"} Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.353390 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" event={"ID":"b1cbef08-6bd3-4010-8d53-914b02a1d670","Type":"ContainerStarted","Data":"55cadb4e1601e7eec6949f561b2fddf69faa521c797b9684a117ddc636666383"} Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.456038 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cg8jm"] Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.533652 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5lnp"] Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.553605 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5lnp"] Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.566348 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-201a-account-create-update-b8qmr"] Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.803002 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d958eb-cffb-4b89-bd08-7a15d68297e6" path="/var/lib/kubelet/pods/16d958eb-cffb-4b89-bd08-7a15d68297e6/volumes" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.803750 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21e258c-5496-49af-a0f2-9515eea67a47" path="/var/lib/kubelet/pods/c21e258c-5496-49af-a0f2-9515eea67a47/volumes" Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.835706 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sswqd"] Feb 27 01:23:41 crc kubenswrapper[4771]: I0227 01:23:41.969013 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-l4w7n"] Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.080399 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b38e-account-create-update-d286j"] Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.363269 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" event={"ID":"b1cbef08-6bd3-4010-8d53-914b02a1d670","Type":"ContainerStarted","Data":"810d51041dcb69779df1a1971d06c3637f8c0a921f433df928b0fcf7547cb422"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.363442 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.364844 4771 generic.go:334] "Generic (PLEG): container finished" podID="634488bc-0dd8-4dbc-8be4-99328c6a0088" containerID="34409f10da27221006f5134c9147e1b83cd82ce20cd7bfc0019338bfb6fee5a3" exitCode=0 Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.365185 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cg8jm" event={"ID":"634488bc-0dd8-4dbc-8be4-99328c6a0088","Type":"ContainerDied","Data":"34409f10da27221006f5134c9147e1b83cd82ce20cd7bfc0019338bfb6fee5a3"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.365210 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cg8jm" event={"ID":"634488bc-0dd8-4dbc-8be4-99328c6a0088","Type":"ContainerStarted","Data":"65d0b8a5444cd296579faf68e4fcb66f38297e2fe22e26d79a61eef90ed7dc98"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.367795 4771 generic.go:334] "Generic (PLEG): container finished" podID="46fd6f86-04d4-4c78-bbf8-9f057ac4308b" containerID="b2c1d545d61e8fbda505e99057e7cbf67ea03b9e8da2fabebf60d89760134563" exitCode=0 Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.367847 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5696-account-create-update-7mx4f" event={"ID":"46fd6f86-04d4-4c78-bbf8-9f057ac4308b","Type":"ContainerDied","Data":"b2c1d545d61e8fbda505e99057e7cbf67ea03b9e8da2fabebf60d89760134563"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.372877 4771 generic.go:334] "Generic (PLEG): container finished" podID="733a6478-88f8-4dd2-ad0b-fa824ec14a4d" containerID="086025db74cc9b34ed6cf0ddb188159f320cba3aa21b4cc17b433b8b021fd3de" exitCode=0 Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.372962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5lfsc" event={"ID":"733a6478-88f8-4dd2-ad0b-fa824ec14a4d","Type":"ContainerDied","Data":"086025db74cc9b34ed6cf0ddb188159f320cba3aa21b4cc17b433b8b021fd3de"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.374075 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b38e-account-create-update-d286j" event={"ID":"091872dd-fe0b-4e93-a837-2fd692af8f21","Type":"ContainerStarted","Data":"9ff7ce57d4423fe2227b6a2c693644f4856a65ce36ec3acc1324a1adfabf219b"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.375395 4771 generic.go:334] "Generic (PLEG): container finished" podID="b9b12792-b88f-4d69-8df2-03dec12a53ac" containerID="ac172242da8d96913693eb009e7bd6164a310f4fb458cfa6efe59999b377503f" exitCode=0 Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.375453 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sswqd" event={"ID":"b9b12792-b88f-4d69-8df2-03dec12a53ac","Type":"ContainerDied","Data":"ac172242da8d96913693eb009e7bd6164a310f4fb458cfa6efe59999b377503f"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.375474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sswqd" 
event={"ID":"b9b12792-b88f-4d69-8df2-03dec12a53ac","Type":"ContainerStarted","Data":"89c05fa8d5d2b6c352b62ac6e09e83228f24c1d19394a3948cb08b10833b821f"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.377230 4771 generic.go:334] "Generic (PLEG): container finished" podID="9d47d411-ba32-47c5-96dc-448ab3aab865" containerID="5c83d516e9c76ea279bfeec3f69185c3127e0875de93167efaf0fe4233e0ed79" exitCode=0 Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.377256 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-201a-account-create-update-b8qmr" event={"ID":"9d47d411-ba32-47c5-96dc-448ab3aab865","Type":"ContainerDied","Data":"5c83d516e9c76ea279bfeec3f69185c3127e0875de93167efaf0fe4233e0ed79"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.377280 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-201a-account-create-update-b8qmr" event={"ID":"9d47d411-ba32-47c5-96dc-448ab3aab865","Type":"ContainerStarted","Data":"47329ce48ac85a03f1d62a1c6925f1b4d29503201ce9f5445a9937c1b68dece6"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.382122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l4w7n" event={"ID":"cf593882-913d-4168-b14f-c7df95930f73","Type":"ContainerStarted","Data":"4e657f696767342bd32dbc1cb53358205f5de6a7123aed3a545a8eaf29d29961"} Feb 27 01:23:42 crc kubenswrapper[4771]: I0227 01:23:42.388453 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" podStartSLOduration=3.388433404 podStartE2EDuration="3.388433404s" podCreationTimestamp="2026-02-27 01:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:42.387454098 +0000 UTC m=+1135.325015386" watchObservedRunningTime="2026-02-27 01:23:42.388433404 +0000 UTC m=+1135.325994692" Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.394431 4771 generic.go:334] "Generic (PLEG): container finished" podID="091872dd-fe0b-4e93-a837-2fd692af8f21" containerID="a9343567e9e56567b52bb82f302faacea0cb49a9e19b11a33b359b8ee0113ca6" exitCode=0 Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.394603 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b38e-account-create-update-d286j" event={"ID":"091872dd-fe0b-4e93-a837-2fd692af8f21","Type":"ContainerDied","Data":"a9343567e9e56567b52bb82f302faacea0cb49a9e19b11a33b359b8ee0113ca6"} Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.801356 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.855728 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/634488bc-0dd8-4dbc-8be4-99328c6a0088-operator-scripts\") pod \"634488bc-0dd8-4dbc-8be4-99328c6a0088\" (UID: \"634488bc-0dd8-4dbc-8be4-99328c6a0088\") " Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.855972 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjsst\" (UniqueName: \"kubernetes.io/projected/634488bc-0dd8-4dbc-8be4-99328c6a0088-kube-api-access-sjsst\") pod \"634488bc-0dd8-4dbc-8be4-99328c6a0088\" (UID: \"634488bc-0dd8-4dbc-8be4-99328c6a0088\") " Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.857665 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634488bc-0dd8-4dbc-8be4-99328c6a0088-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "634488bc-0dd8-4dbc-8be4-99328c6a0088" (UID: "634488bc-0dd8-4dbc-8be4-99328c6a0088"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.865171 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634488bc-0dd8-4dbc-8be4-99328c6a0088-kube-api-access-sjsst" (OuterVolumeSpecName: "kube-api-access-sjsst") pod "634488bc-0dd8-4dbc-8be4-99328c6a0088" (UID: "634488bc-0dd8-4dbc-8be4-99328c6a0088"). InnerVolumeSpecName "kube-api-access-sjsst". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.957421 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjsst\" (UniqueName: \"kubernetes.io/projected/634488bc-0dd8-4dbc-8be4-99328c6a0088-kube-api-access-sjsst\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.957455 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/634488bc-0dd8-4dbc-8be4-99328c6a0088-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.987995 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:43 crc kubenswrapper[4771]: I0227 01:23:43.994838 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.006985 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.015769 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.066356 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crvcc\" (UniqueName: \"kubernetes.io/projected/b9b12792-b88f-4d69-8df2-03dec12a53ac-kube-api-access-crvcc\") pod \"b9b12792-b88f-4d69-8df2-03dec12a53ac\" (UID: \"b9b12792-b88f-4d69-8df2-03dec12a53ac\") " Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.066492 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b12792-b88f-4d69-8df2-03dec12a53ac-operator-scripts\") pod \"b9b12792-b88f-4d69-8df2-03dec12a53ac\" (UID: \"b9b12792-b88f-4d69-8df2-03dec12a53ac\") " Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.066534 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d47d411-ba32-47c5-96dc-448ab3aab865-operator-scripts\") pod \"9d47d411-ba32-47c5-96dc-448ab3aab865\" (UID: \"9d47d411-ba32-47c5-96dc-448ab3aab865\") " Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.066589 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbtml\" (UniqueName: \"kubernetes.io/projected/9d47d411-ba32-47c5-96dc-448ab3aab865-kube-api-access-pbtml\") pod \"9d47d411-ba32-47c5-96dc-448ab3aab865\" (UID: \"9d47d411-ba32-47c5-96dc-448ab3aab865\") " Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.066628 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-operator-scripts\") pod \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\" (UID: \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\") " Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.066660 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp4k6\" (UniqueName: \"kubernetes.io/projected/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-kube-api-access-xp4k6\") pod \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\" (UID: \"733a6478-88f8-4dd2-ad0b-fa824ec14a4d\") " Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.066714 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqv9q\" (UniqueName: \"kubernetes.io/projected/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-kube-api-access-pqv9q\") pod \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\" (UID: \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\") " Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.066742 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-operator-scripts\") pod \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\" (UID: \"46fd6f86-04d4-4c78-bbf8-9f057ac4308b\") " Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.067940 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46fd6f86-04d4-4c78-bbf8-9f057ac4308b" (UID: "46fd6f86-04d4-4c78-bbf8-9f057ac4308b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.069651 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b12792-b88f-4d69-8df2-03dec12a53ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9b12792-b88f-4d69-8df2-03dec12a53ac" (UID: "b9b12792-b88f-4d69-8df2-03dec12a53ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.070346 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d47d411-ba32-47c5-96dc-448ab3aab865-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d47d411-ba32-47c5-96dc-448ab3aab865" (UID: "9d47d411-ba32-47c5-96dc-448ab3aab865"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.082821 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-kube-api-access-xp4k6" (OuterVolumeSpecName: "kube-api-access-xp4k6") pod "733a6478-88f8-4dd2-ad0b-fa824ec14a4d" (UID: "733a6478-88f8-4dd2-ad0b-fa824ec14a4d"). InnerVolumeSpecName "kube-api-access-xp4k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.070538 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "733a6478-88f8-4dd2-ad0b-fa824ec14a4d" (UID: "733a6478-88f8-4dd2-ad0b-fa824ec14a4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.085855 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-kube-api-access-pqv9q" (OuterVolumeSpecName: "kube-api-access-pqv9q") pod "46fd6f86-04d4-4c78-bbf8-9f057ac4308b" (UID: "46fd6f86-04d4-4c78-bbf8-9f057ac4308b"). InnerVolumeSpecName "kube-api-access-pqv9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.087232 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d47d411-ba32-47c5-96dc-448ab3aab865-kube-api-access-pbtml" (OuterVolumeSpecName: "kube-api-access-pbtml") pod "9d47d411-ba32-47c5-96dc-448ab3aab865" (UID: "9d47d411-ba32-47c5-96dc-448ab3aab865"). InnerVolumeSpecName "kube-api-access-pbtml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.090340 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b12792-b88f-4d69-8df2-03dec12a53ac-kube-api-access-crvcc" (OuterVolumeSpecName: "kube-api-access-crvcc") pod "b9b12792-b88f-4d69-8df2-03dec12a53ac" (UID: "b9b12792-b88f-4d69-8df2-03dec12a53ac"). InnerVolumeSpecName "kube-api-access-crvcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.168872 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9b12792-b88f-4d69-8df2-03dec12a53ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.168907 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d47d411-ba32-47c5-96dc-448ab3aab865-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.168919 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbtml\" (UniqueName: \"kubernetes.io/projected/9d47d411-ba32-47c5-96dc-448ab3aab865-kube-api-access-pbtml\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.168931 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.168938 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp4k6\" (UniqueName: \"kubernetes.io/projected/733a6478-88f8-4dd2-ad0b-fa824ec14a4d-kube-api-access-xp4k6\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.168946 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqv9q\" (UniqueName: \"kubernetes.io/projected/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-kube-api-access-pqv9q\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.168954 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46fd6f86-04d4-4c78-bbf8-9f057ac4308b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.168962 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crvcc\" (UniqueName: \"kubernetes.io/projected/b9b12792-b88f-4d69-8df2-03dec12a53ac-kube-api-access-crvcc\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.406437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sswqd" event={"ID":"b9b12792-b88f-4d69-8df2-03dec12a53ac","Type":"ContainerDied","Data":"89c05fa8d5d2b6c352b62ac6e09e83228f24c1d19394a3948cb08b10833b821f"} Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.406484 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89c05fa8d5d2b6c352b62ac6e09e83228f24c1d19394a3948cb08b10833b821f" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.406574 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-sswqd" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.409813 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-201a-account-create-update-b8qmr" event={"ID":"9d47d411-ba32-47c5-96dc-448ab3aab865","Type":"ContainerDied","Data":"47329ce48ac85a03f1d62a1c6925f1b4d29503201ce9f5445a9937c1b68dece6"} Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.409836 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47329ce48ac85a03f1d62a1c6925f1b4d29503201ce9f5445a9937c1b68dece6" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.409933 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-201a-account-create-update-b8qmr" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.421015 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5696-account-create-update-7mx4f" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.421332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5696-account-create-update-7mx4f" event={"ID":"46fd6f86-04d4-4c78-bbf8-9f057ac4308b","Type":"ContainerDied","Data":"031bdd9fd16539743643f8c4773b2ae677e3ecaf0f5d292f99bc62a8db55360c"} Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.421371 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="031bdd9fd16539743643f8c4773b2ae677e3ecaf0f5d292f99bc62a8db55360c" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.425963 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cg8jm" event={"ID":"634488bc-0dd8-4dbc-8be4-99328c6a0088","Type":"ContainerDied","Data":"65d0b8a5444cd296579faf68e4fcb66f38297e2fe22e26d79a61eef90ed7dc98"} Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.426023 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65d0b8a5444cd296579faf68e4fcb66f38297e2fe22e26d79a61eef90ed7dc98" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.425989 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cg8jm" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.433336 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5lfsc" Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.433632 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5lfsc" event={"ID":"733a6478-88f8-4dd2-ad0b-fa824ec14a4d","Type":"ContainerDied","Data":"a62c92429942a753753c3f9317d8f5b9087f3b439dbbdbaf4dbefe8561e8141d"} Feb 27 01:23:44 crc kubenswrapper[4771]: I0227 01:23:44.433672 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a62c92429942a753753c3f9317d8f5b9087f3b439dbbdbaf4dbefe8561e8141d" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.042029 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jdwvn"] Feb 27 01:23:46 crc kubenswrapper[4771]: E0227 01:23:46.042830 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d47d411-ba32-47c5-96dc-448ab3aab865" containerName="mariadb-account-create-update" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.042851 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d47d411-ba32-47c5-96dc-448ab3aab865" containerName="mariadb-account-create-update" Feb 27 01:23:46 crc kubenswrapper[4771]: E0227 01:23:46.042880 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634488bc-0dd8-4dbc-8be4-99328c6a0088" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.042893 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="634488bc-0dd8-4dbc-8be4-99328c6a0088" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: E0227 01:23:46.042927 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733a6478-88f8-4dd2-ad0b-fa824ec14a4d" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.042940 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="733a6478-88f8-4dd2-ad0b-fa824ec14a4d" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: E0227 01:23:46.042967 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b12792-b88f-4d69-8df2-03dec12a53ac" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.042980 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b12792-b88f-4d69-8df2-03dec12a53ac" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: E0227 01:23:46.042999 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21e258c-5496-49af-a0f2-9515eea67a47" containerName="init" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.043010 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21e258c-5496-49af-a0f2-9515eea67a47" containerName="init" Feb 27 01:23:46 crc kubenswrapper[4771]: E0227 01:23:46.043028 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fd6f86-04d4-4c78-bbf8-9f057ac4308b" containerName="mariadb-account-create-update" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.043040 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fd6f86-04d4-4c78-bbf8-9f057ac4308b" containerName="mariadb-account-create-update" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.043296 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d47d411-ba32-47c5-96dc-448ab3aab865" containerName="mariadb-account-create-update" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.043321 4771 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c21e258c-5496-49af-a0f2-9515eea67a47" containerName="init" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.043354 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="733a6478-88f8-4dd2-ad0b-fa824ec14a4d" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.043374 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="634488bc-0dd8-4dbc-8be4-99328c6a0088" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.043400 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fd6f86-04d4-4c78-bbf8-9f057ac4308b" containerName="mariadb-account-create-update" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.043419 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b12792-b88f-4d69-8df2-03dec12a53ac" containerName="mariadb-database-create" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.044307 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.047131 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.070369 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jdwvn"] Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.115938 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cd00ad5-f04b-4756-b60e-054f87509d3f-operator-scripts\") pod \"root-account-create-update-jdwvn\" (UID: \"9cd00ad5-f04b-4756-b60e-054f87509d3f\") " pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.116407 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4xs\" (UniqueName: \"kubernetes.io/projected/9cd00ad5-f04b-4756-b60e-054f87509d3f-kube-api-access-2m4xs\") pod \"root-account-create-update-jdwvn\" (UID: \"9cd00ad5-f04b-4756-b60e-054f87509d3f\") " pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.218018 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4xs\" (UniqueName: \"kubernetes.io/projected/9cd00ad5-f04b-4756-b60e-054f87509d3f-kube-api-access-2m4xs\") pod \"root-account-create-update-jdwvn\" (UID: \"9cd00ad5-f04b-4756-b60e-054f87509d3f\") " pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.218085 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cd00ad5-f04b-4756-b60e-054f87509d3f-operator-scripts\") pod \"root-account-create-update-jdwvn\" (UID: \"9cd00ad5-f04b-4756-b60e-054f87509d3f\") " pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.219005 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cd00ad5-f04b-4756-b60e-054f87509d3f-operator-scripts\") pod \"root-account-create-update-jdwvn\" (UID: \"9cd00ad5-f04b-4756-b60e-054f87509d3f\") " pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:46 crc 
kubenswrapper[4771]: I0227 01:23:46.237461 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4xs\" (UniqueName: \"kubernetes.io/projected/9cd00ad5-f04b-4756-b60e-054f87509d3f-kube-api-access-2m4xs\") pod \"root-account-create-update-jdwvn\" (UID: \"9cd00ad5-f04b-4756-b60e-054f87509d3f\") " pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:46 crc kubenswrapper[4771]: I0227 01:23:46.374467 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.259609 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.352167 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjh2n\" (UniqueName: \"kubernetes.io/projected/091872dd-fe0b-4e93-a837-2fd692af8f21-kube-api-access-hjh2n\") pod \"091872dd-fe0b-4e93-a837-2fd692af8f21\" (UID: \"091872dd-fe0b-4e93-a837-2fd692af8f21\") " Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.352417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/091872dd-fe0b-4e93-a837-2fd692af8f21-operator-scripts\") pod \"091872dd-fe0b-4e93-a837-2fd692af8f21\" (UID: \"091872dd-fe0b-4e93-a837-2fd692af8f21\") " Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.353881 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/091872dd-fe0b-4e93-a837-2fd692af8f21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "091872dd-fe0b-4e93-a837-2fd692af8f21" (UID: "091872dd-fe0b-4e93-a837-2fd692af8f21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.357672 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091872dd-fe0b-4e93-a837-2fd692af8f21-kube-api-access-hjh2n" (OuterVolumeSpecName: "kube-api-access-hjh2n") pod "091872dd-fe0b-4e93-a837-2fd692af8f21" (UID: "091872dd-fe0b-4e93-a837-2fd692af8f21"). InnerVolumeSpecName "kube-api-access-hjh2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.454540 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjh2n\" (UniqueName: \"kubernetes.io/projected/091872dd-fe0b-4e93-a837-2fd692af8f21-kube-api-access-hjh2n\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.454855 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/091872dd-fe0b-4e93-a837-2fd692af8f21-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.456888 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l4w7n" event={"ID":"cf593882-913d-4168-b14f-c7df95930f73","Type":"ContainerStarted","Data":"c3c14a90a5c23fa0f6008256eb0bb7d14ebd6bc3f79eb00361481bca14439cd0"} Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.458047 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b38e-account-create-update-d286j" event={"ID":"091872dd-fe0b-4e93-a837-2fd692af8f21","Type":"ContainerDied","Data":"9ff7ce57d4423fe2227b6a2c693644f4856a65ce36ec3acc1324a1adfabf219b"} Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.458082 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ff7ce57d4423fe2227b6a2c693644f4856a65ce36ec3acc1324a1adfabf219b" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.458093 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b38e-account-create-update-d286j" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.494946 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-l4w7n" podStartSLOduration=2.292741721 podStartE2EDuration="7.494927414s" podCreationTimestamp="2026-02-27 01:23:40 +0000 UTC" firstStartedPulling="2026-02-27 01:23:41.973054652 +0000 UTC m=+1134.910615940" lastFinishedPulling="2026-02-27 01:23:47.175240335 +0000 UTC m=+1140.112801633" observedRunningTime="2026-02-27 01:23:47.486667649 +0000 UTC m=+1140.424228947" watchObservedRunningTime="2026-02-27 01:23:47.494927414 +0000 UTC m=+1140.432488702" Feb 27 01:23:47 crc kubenswrapper[4771]: I0227 01:23:47.619736 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jdwvn"] Feb 27 01:23:47 crc kubenswrapper[4771]: W0227 01:23:47.620898 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd00ad5_f04b_4756_b60e_054f87509d3f.slice/crio-1bf113add22fcfece7b1298c6e4a6a2db37208e7ef81b25c1736b08a57059cf1 WatchSource:0}: Error finding container 1bf113add22fcfece7b1298c6e4a6a2db37208e7ef81b25c1736b08a57059cf1: Status 404 returned error can't find the container with id 1bf113add22fcfece7b1298c6e4a6a2db37208e7ef81b25c1736b08a57059cf1 Feb 27 01:23:48 crc kubenswrapper[4771]: I0227 01:23:48.480402 4771 generic.go:334] "Generic (PLEG): container finished" podID="9cd00ad5-f04b-4756-b60e-054f87509d3f" containerID="2c38bbafbf6607bc5a1382e4dffd8f62f5ccb078bb97519ab8e7f00014769e65" exitCode=0 Feb 27 01:23:48 crc kubenswrapper[4771]: I0227 01:23:48.480492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jdwvn" 
event={"ID":"9cd00ad5-f04b-4756-b60e-054f87509d3f","Type":"ContainerDied","Data":"2c38bbafbf6607bc5a1382e4dffd8f62f5ccb078bb97519ab8e7f00014769e65"} Feb 27 01:23:48 crc kubenswrapper[4771]: I0227 01:23:48.480710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jdwvn" event={"ID":"9cd00ad5-f04b-4756-b60e-054f87509d3f","Type":"ContainerStarted","Data":"1bf113add22fcfece7b1298c6e4a6a2db37208e7ef81b25c1736b08a57059cf1"} Feb 27 01:23:49 crc kubenswrapper[4771]: I0227 01:23:49.888625 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:49 crc kubenswrapper[4771]: I0227 01:23:49.996775 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cd00ad5-f04b-4756-b60e-054f87509d3f-operator-scripts\") pod \"9cd00ad5-f04b-4756-b60e-054f87509d3f\" (UID: \"9cd00ad5-f04b-4756-b60e-054f87509d3f\") " Feb 27 01:23:49 crc kubenswrapper[4771]: I0227 01:23:49.996864 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m4xs\" (UniqueName: \"kubernetes.io/projected/9cd00ad5-f04b-4756-b60e-054f87509d3f-kube-api-access-2m4xs\") pod \"9cd00ad5-f04b-4756-b60e-054f87509d3f\" (UID: \"9cd00ad5-f04b-4756-b60e-054f87509d3f\") " Feb 27 01:23:49 crc kubenswrapper[4771]: I0227 01:23:49.997715 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cd00ad5-f04b-4756-b60e-054f87509d3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9cd00ad5-f04b-4756-b60e-054f87509d3f" (UID: "9cd00ad5-f04b-4756-b60e-054f87509d3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.005242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd00ad5-f04b-4756-b60e-054f87509d3f-kube-api-access-2m4xs" (OuterVolumeSpecName: "kube-api-access-2m4xs") pod "9cd00ad5-f04b-4756-b60e-054f87509d3f" (UID: "9cd00ad5-f04b-4756-b60e-054f87509d3f"). InnerVolumeSpecName "kube-api-access-2m4xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.008768 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.092958 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-w4hf9"] Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.093175 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-w4hf9" podUID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" containerName="dnsmasq-dns" containerID="cri-o://9eac1ca5ec14913833db7f0abc8b1c19460bb4ed27ce8a4af8bb66832562b005" gracePeriod=10 Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.098938 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cd00ad5-f04b-4756-b60e-054f87509d3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.098963 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m4xs\" (UniqueName: \"kubernetes.io/projected/9cd00ad5-f04b-4756-b60e-054f87509d3f-kube-api-access-2m4xs\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.500022 4771 generic.go:334] "Generic (PLEG): container finished" podID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" containerID="9eac1ca5ec14913833db7f0abc8b1c19460bb4ed27ce8a4af8bb66832562b005" exitCode=0 Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.500090 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-w4hf9" event={"ID":"2d72b00b-f919-47dd-8f0b-428a4d08d0e8","Type":"ContainerDied","Data":"9eac1ca5ec14913833db7f0abc8b1c19460bb4ed27ce8a4af8bb66832562b005"} Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.500362 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-w4hf9" event={"ID":"2d72b00b-f919-47dd-8f0b-428a4d08d0e8","Type":"ContainerDied","Data":"674ebc610c82d34ca1eaa4f8ba55e87fe412d6309e5024658e55439acfac4e44"} Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.500375 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="674ebc610c82d34ca1eaa4f8ba55e87fe412d6309e5024658e55439acfac4e44" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.503611 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jdwvn" event={"ID":"9cd00ad5-f04b-4756-b60e-054f87509d3f","Type":"ContainerDied","Data":"1bf113add22fcfece7b1298c6e4a6a2db37208e7ef81b25c1736b08a57059cf1"} Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.503634 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bf113add22fcfece7b1298c6e4a6a2db37208e7ef81b25c1736b08a57059cf1" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.503664 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jdwvn" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.512271 4771 generic.go:334] "Generic (PLEG): container finished" podID="cf593882-913d-4168-b14f-c7df95930f73" containerID="c3c14a90a5c23fa0f6008256eb0bb7d14ebd6bc3f79eb00361481bca14439cd0" exitCode=0 Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.512293 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l4w7n" event={"ID":"cf593882-913d-4168-b14f-c7df95930f73","Type":"ContainerDied","Data":"c3c14a90a5c23fa0f6008256eb0bb7d14ebd6bc3f79eb00361481bca14439cd0"} Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.514325 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.614714 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-config\") pod \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.614790 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpdsz\" (UniqueName: \"kubernetes.io/projected/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-kube-api-access-xpdsz\") pod \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.614900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-dns-svc\") pod \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.614921 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-sb\") pod \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.614979 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-nb\") pod \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\" (UID: \"2d72b00b-f919-47dd-8f0b-428a4d08d0e8\") " Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.619305 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-kube-api-access-xpdsz" (OuterVolumeSpecName: "kube-api-access-xpdsz") pod "2d72b00b-f919-47dd-8f0b-428a4d08d0e8" (UID: "2d72b00b-f919-47dd-8f0b-428a4d08d0e8"). InnerVolumeSpecName "kube-api-access-xpdsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.652240 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d72b00b-f919-47dd-8f0b-428a4d08d0e8" (UID: "2d72b00b-f919-47dd-8f0b-428a4d08d0e8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.657485 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d72b00b-f919-47dd-8f0b-428a4d08d0e8" (UID: "2d72b00b-f919-47dd-8f0b-428a4d08d0e8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.666090 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d72b00b-f919-47dd-8f0b-428a4d08d0e8" (UID: "2d72b00b-f919-47dd-8f0b-428a4d08d0e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.672752 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-config" (OuterVolumeSpecName: "config") pod "2d72b00b-f919-47dd-8f0b-428a4d08d0e8" (UID: "2d72b00b-f919-47dd-8f0b-428a4d08d0e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.716752 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpdsz\" (UniqueName: \"kubernetes.io/projected/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-kube-api-access-xpdsz\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.716822 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.716835 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.716846 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:50 crc kubenswrapper[4771]: I0227 01:23:50.716858 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d72b00b-f919-47dd-8f0b-428a4d08d0e8-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:51 crc kubenswrapper[4771]: I0227 01:23:51.520265 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-w4hf9" Feb 27 01:23:51 crc kubenswrapper[4771]: I0227 01:23:51.577895 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-w4hf9"] Feb 27 01:23:51 crc kubenswrapper[4771]: I0227 01:23:51.593932 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-w4hf9"] Feb 27 01:23:51 crc kubenswrapper[4771]: I0227 01:23:51.785154 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" path="/var/lib/kubelet/pods/2d72b00b-f919-47dd-8f0b-428a4d08d0e8/volumes" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.003581 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.037897 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-combined-ca-bundle\") pod \"cf593882-913d-4168-b14f-c7df95930f73\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.037991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c269k\" (UniqueName: \"kubernetes.io/projected/cf593882-913d-4168-b14f-c7df95930f73-kube-api-access-c269k\") pod \"cf593882-913d-4168-b14f-c7df95930f73\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.038122 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-config-data\") pod \"cf593882-913d-4168-b14f-c7df95930f73\" (UID: \"cf593882-913d-4168-b14f-c7df95930f73\") " Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.051040 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf593882-913d-4168-b14f-c7df95930f73-kube-api-access-c269k" (OuterVolumeSpecName: "kube-api-access-c269k") pod "cf593882-913d-4168-b14f-c7df95930f73" (UID: "cf593882-913d-4168-b14f-c7df95930f73"). InnerVolumeSpecName "kube-api-access-c269k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.064181 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf593882-913d-4168-b14f-c7df95930f73" (UID: "cf593882-913d-4168-b14f-c7df95930f73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.087207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-config-data" (OuterVolumeSpecName: "config-data") pod "cf593882-913d-4168-b14f-c7df95930f73" (UID: "cf593882-913d-4168-b14f-c7df95930f73"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.140743 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c269k\" (UniqueName: \"kubernetes.io/projected/cf593882-913d-4168-b14f-c7df95930f73-kube-api-access-c269k\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.140780 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.140790 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf593882-913d-4168-b14f-c7df95930f73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.531252 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l4w7n" event={"ID":"cf593882-913d-4168-b14f-c7df95930f73","Type":"ContainerDied","Data":"4e657f696767342bd32dbc1cb53358205f5de6a7123aed3a545a8eaf29d29961"} Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.531290 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-l4w7n" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.531294 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e657f696767342bd32dbc1cb53358205f5de6a7123aed3a545a8eaf29d29961" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.774891 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5b2xx"] Feb 27 01:23:52 crc kubenswrapper[4771]: E0227 01:23:52.775255 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf593882-913d-4168-b14f-c7df95930f73" containerName="keystone-db-sync" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.775277 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf593882-913d-4168-b14f-c7df95930f73" containerName="keystone-db-sync" Feb 27 01:23:52 crc kubenswrapper[4771]: E0227 01:23:52.775289 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" containerName="dnsmasq-dns" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.776666 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" containerName="dnsmasq-dns" Feb 27 01:23:52 crc kubenswrapper[4771]: E0227 01:23:52.776686 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd00ad5-f04b-4756-b60e-054f87509d3f" containerName="mariadb-account-create-update" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.776696 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd00ad5-f04b-4756-b60e-054f87509d3f" containerName="mariadb-account-create-update" Feb 27 01:23:52 crc kubenswrapper[4771]: E0227 01:23:52.776727 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" containerName="init" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.776735 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" containerName="init" Feb 27 01:23:52 crc kubenswrapper[4771]: E0227 01:23:52.776745 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091872dd-fe0b-4e93-a837-2fd692af8f21" containerName="mariadb-account-create-update" Feb 27 01:23:52 
crc kubenswrapper[4771]: I0227 01:23:52.776753 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="091872dd-fe0b-4e93-a837-2fd692af8f21" containerName="mariadb-account-create-update" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.776964 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd00ad5-f04b-4756-b60e-054f87509d3f" containerName="mariadb-account-create-update" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.776983 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf593882-913d-4168-b14f-c7df95930f73" containerName="keystone-db-sync" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.776995 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d72b00b-f919-47dd-8f0b-428a4d08d0e8" containerName="dnsmasq-dns" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.777009 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="091872dd-fe0b-4e93-a837-2fd692af8f21" containerName="mariadb-account-create-update" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.777578 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.800448 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-wjc2q"] Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.802303 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.805831 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.805841 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.806087 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7nd8k" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.806280 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.806383 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.837354 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5b2xx"] Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-kube-api-access-lxjfl\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858385 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-credential-keys\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858411 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-scripts\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858466 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-combined-ca-bundle\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858648 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-config\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858766 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txl4f\" (UniqueName: \"kubernetes.io/projected/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-kube-api-access-txl4f\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858845 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-fernet-keys\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858887 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.858917 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-config-data\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.859962 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-wjc2q"] Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-config\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960351 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txl4f\" (UniqueName: \"kubernetes.io/projected/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-kube-api-access-txl4f\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960389 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-fernet-keys\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960411 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960432 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-config-data\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960460 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-kube-api-access-lxjfl\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960475 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-credential-keys\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc 
kubenswrapper[4771]: I0227 01:23:52.960490 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-scripts\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960514 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-combined-ca-bundle\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960559 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.960614 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.961611 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.962191 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-config\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.962706 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.976325 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-combined-ca-bundle\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.976977 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.977275 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-credential-keys\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.978189 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.978762 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-fernet-keys\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:52 crc kubenswrapper[4771]: I0227 01:23:52.987874 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-69b7666f9c-x658x"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.000086 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txl4f\" (UniqueName: \"kubernetes.io/projected/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-kube-api-access-txl4f\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.001062 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-config-data\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.002684 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.011202 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-scripts\") pod \"keystone-bootstrap-5b2xx\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.011491 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.012003 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.012096 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-ksrgs" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.012260 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.019380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-kube-api-access-lxjfl\") pod \"dnsmasq-dns-847c4cc679-wjc2q\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.024146 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mg85q"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.034779 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69b7666f9c-x658x"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.057661 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.064746 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.069540 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hjxfj" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.083565 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.085734 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-combined-ca-bundle\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.085813 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-config\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.085834 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dws6k\" (UniqueName: \"kubernetes.io/projected/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-kube-api-access-dws6k\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.085855 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-config-data\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.085893 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-logs\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.085918 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-horizon-secret-key\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.085932 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-scripts\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.085970 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dgbb\" (UniqueName: 
\"kubernetes.io/projected/d82930a8-1630-4e85-86f0-0f2027e7225d-kube-api-access-5dgbb\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.086203 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.087629 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.099279 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.099497 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.099736 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mg85q"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.113727 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.123106 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.171074 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192296 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dgbb\" (UniqueName: \"kubernetes.io/projected/d82930a8-1630-4e85-86f0-0f2027e7225d-kube-api-access-5dgbb\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-combined-ca-bundle\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-run-httpd\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-config\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192459 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dws6k\" (UniqueName: \"kubernetes.io/projected/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-kube-api-access-dws6k\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-config-data\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192508 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkfnl\" (UniqueName: \"kubernetes.io/projected/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-kube-api-access-qkfnl\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192565 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-logs\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-log-httpd\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192598 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-scripts\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-horizon-secret-key\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192634 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-config-data\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.192649 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-scripts\") pod \"horizon-69b7666f9c-x658x\" (UID: 
\"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.193596 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-logs\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.202288 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-scripts\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.202537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-horizon-secret-key\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.207781 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-combined-ca-bundle\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.209272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-config-data\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.221845 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-config\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.240120 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dws6k\" (UniqueName: \"kubernetes.io/projected/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-kube-api-access-dws6k\") pod \"horizon-69b7666f9c-x658x\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.248117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dgbb\" (UniqueName: \"kubernetes.io/projected/d82930a8-1630-4e85-86f0-0f2027e7225d-kube-api-access-5dgbb\") pod \"neutron-db-sync-mg85q\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.260839 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zhfdt"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.264630 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.277620 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.277897 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.278008 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vdgqk" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.294978 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkfnl\" (UniqueName: \"kubernetes.io/projected/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-kube-api-access-qkfnl\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.295021 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-log-httpd\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.295042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-scripts\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.295059 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-config-data\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.295118 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-run-httpd\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.295136 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.295201 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.296879 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-run-httpd\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.296960 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-log-httpd\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.302757 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mg85q" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.302916 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.312502 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-config-data\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.315123 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-scripts\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.325342 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.328455 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkfnl\" (UniqueName: \"kubernetes.io/projected/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-kube-api-access-qkfnl\") pod \"ceilometer-0\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.333570 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.361051 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zhfdt"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.383998 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-r69k6"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.385214 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.395531 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.395727 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8jvt5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.396612 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-db-sync-config-data\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.396656 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-scripts\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.396686 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scgr9\" (UniqueName: \"kubernetes.io/projected/37e7849a-97b9-4e3d-9ad3-c0c942775e64-kube-api-access-scgr9\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.396705 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37e7849a-97b9-4e3d-9ad3-c0c942775e64-etc-machine-id\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.396745 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-combined-ca-bundle\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.396897 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-config-data\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.407855 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-r69k6"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.435379 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-tclqb"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.436696 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.441821 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-594f74c97c-r6bp5"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.449216 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.457766 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-wjc2q"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.458977 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-594f74c97c-r6bp5"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.457869 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.462116 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.462168 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.462216 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fn4nn" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.468006 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tclqb"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.487896 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.489430 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.492330 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.492457 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.492541 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jpjnq" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.492647 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502132 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-combined-ca-bundle\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502177 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b411543d-f7a2-4a56-acb5-9b2d9598739a-logs\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502198 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e291ec97-2bfe-4bbe-a39d-9eca937f1855-logs\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502223 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx2kh\" (UniqueName: \"kubernetes.io/projected/e291ec97-2bfe-4bbe-a39d-9eca937f1855-kube-api-access-mx2kh\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502246 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-config-data\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502268 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-combined-ca-bundle\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502295 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-scripts\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-config-data\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502830 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-db-sync-config-data\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502858 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-scripts\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.502969 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scgr9\" (UniqueName: \"kubernetes.io/projected/37e7849a-97b9-4e3d-9ad3-c0c942775e64-kube-api-access-scgr9\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.503014 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37e7849a-97b9-4e3d-9ad3-c0c942775e64-etc-machine-id\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.503045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-db-sync-config-data\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.503081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e291ec97-2bfe-4bbe-a39d-9eca937f1855-horizon-secret-key\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.503101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5tfq\" (UniqueName: \"kubernetes.io/projected/b411543d-f7a2-4a56-acb5-9b2d9598739a-kube-api-access-d5tfq\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.503131 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-scripts\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.503145 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwlnl\" (UniqueName: \"kubernetes.io/projected/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-kube-api-access-vwlnl\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.503166 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-config-data\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.503187 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-combined-ca-bundle\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.504131 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37e7849a-97b9-4e3d-9ad3-c0c942775e64-etc-machine-id\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.504208 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.519323 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-config-data\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.520248 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-db-sync-config-data\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.520526 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-scripts\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.533392 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-combined-ca-bundle\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.534747 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scgr9\" (UniqueName: \"kubernetes.io/projected/37e7849a-97b9-4e3d-9ad3-c0c942775e64-kube-api-access-scgr9\") pod \"cinder-db-sync-zhfdt\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") " pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.548641 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lmh2v"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.561445 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.577127 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lmh2v"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.613855 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-scripts\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.613964 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-db-sync-config-data\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.613993 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e291ec97-2bfe-4bbe-a39d-9eca937f1855-horizon-secret-key\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614010 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5tfq\" (UniqueName: \"kubernetes.io/projected/b411543d-f7a2-4a56-acb5-9b2d9598739a-kube-api-access-d5tfq\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614034 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vwlnl\" (UniqueName: \"kubernetes.io/projected/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-kube-api-access-vwlnl\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614056 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-scripts\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614076 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-config-data\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614131 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-combined-ca-bundle\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614151 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b411543d-f7a2-4a56-acb5-9b2d9598739a-logs\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e291ec97-2bfe-4bbe-a39d-9eca937f1855-logs\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx2kh\" (UniqueName: \"kubernetes.io/projected/e291ec97-2bfe-4bbe-a39d-9eca937f1855-kube-api-access-mx2kh\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614221 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-config-data\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.614239 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-combined-ca-bundle\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.620298 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b411543d-f7a2-4a56-acb5-9b2d9598739a-logs\") pod \"placement-db-sync-tclqb\" (UID: 
\"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.621758 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-config-data\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.624176 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-scripts\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.624777 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-scripts\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.626023 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-combined-ca-bundle\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.626342 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-combined-ca-bundle\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.627453 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e291ec97-2bfe-4bbe-a39d-9eca937f1855-logs\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.628649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-db-sync-config-data\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.645755 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e291ec97-2bfe-4bbe-a39d-9eca937f1855-horizon-secret-key\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.646940 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.650753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx2kh\" (UniqueName: \"kubernetes.io/projected/e291ec97-2bfe-4bbe-a39d-9eca937f1855-kube-api-access-mx2kh\") pod \"horizon-594f74c97c-r6bp5\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.651711 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5tfq\" (UniqueName: \"kubernetes.io/projected/b411543d-f7a2-4a56-acb5-9b2d9598739a-kube-api-access-d5tfq\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.655662 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwlnl\" (UniqueName: \"kubernetes.io/projected/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-kube-api-access-vwlnl\") pod \"barbican-db-sync-r69k6\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.655674 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-config-data\") pod \"placement-db-sync-tclqb\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.716869 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-r69k6" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.717456 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.721801 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-config\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.721979 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722077 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722171 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722257 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722363 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722449 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722541 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722636 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtdpd\" (UniqueName: \"kubernetes.io/projected/b6c309b4-8181-4f27-816a-f24419e2237f-kube-api-access-vtdpd\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722748 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-logs\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722841 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722901 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.722998 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wswsc\" (UniqueName: \"kubernetes.io/projected/394871e9-ec61-4b01-8d2a-90ce7785052b-kube-api-access-wswsc\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.829916 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tclqb" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835497 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835669 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835696 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835758 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtdpd\" (UniqueName: \"kubernetes.io/projected/b6c309b4-8181-4f27-816a-f24419e2237f-kube-api-access-vtdpd\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835813 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-logs\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835868 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " 
pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wswsc\" (UniqueName: \"kubernetes.io/projected/394871e9-ec61-4b01-8d2a-90ce7785052b-kube-api-access-wswsc\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835933 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-config\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.835999 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.836014 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.836034 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.836913 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.837422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.837802 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.838207 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.838787 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-config\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.839329 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.845531 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.846043 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.846247 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-logs\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.854051 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.875051 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wswsc\" (UniqueName: \"kubernetes.io/projected/394871e9-ec61-4b01-8d2a-90ce7785052b-kube-api-access-wswsc\") pod \"dnsmasq-dns-785d8bcb8c-lmh2v\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.880396 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtdpd\" (UniqueName: \"kubernetes.io/projected/b6c309b4-8181-4f27-816a-f24419e2237f-kube-api-access-vtdpd\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.880449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.883338 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.908441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.918700 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5b2xx"] Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.919417 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.928961 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:23:53 crc kubenswrapper[4771]: I0227 01:23:53.943615 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:54 crc kubenswrapper[4771]: W0227 01:23:54.024218 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20ec595c_6590_4b78_8ecc_f1e93d38d9f0.slice/crio-7ceddce40eef04b697d22781eda3fd2ff1f77e7c766a5b816606f04291e79a56 WatchSource:0}: Error finding container 7ceddce40eef04b697d22781eda3fd2ff1f77e7c766a5b816606f04291e79a56: Status 404 returned error can't find the container with id 7ceddce40eef04b697d22781eda3fd2ff1f77e7c766a5b816606f04291e79a56 Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.066278 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.071887 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.078089 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.078268 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.080024 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-wjc2q"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.120987 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.202093 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mg85q"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.231445 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.258811 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.258864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.258900 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.258922 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.258952 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pmc8\" (UniqueName: \"kubernetes.io/projected/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-kube-api-access-6pmc8\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.259015 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 
01:23:54.259033 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.259058 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.362107 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.365762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.365814 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.365837 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.365880 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pmc8\" (UniqueName: \"kubernetes.io/projected/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-kube-api-access-6pmc8\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.365993 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.366009 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.366033 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.366457 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-logs\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.366670 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.375399 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.405318 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pmc8\" (UniqueName: \"kubernetes.io/projected/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-kube-api-access-6pmc8\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.405909 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.406769 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.406815 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.407469 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69b7666f9c-x658x"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.412370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.435171 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.585932 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-r69k6"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.591684 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zhfdt"] Feb 27 01:23:54 crc kubenswrapper[4771]: W0227 01:23:54.605832 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e7849a_97b9_4e3d_9ad3_c0c942775e64.slice/crio-2cd6bcdc237d5014263bcfe603054958d12eed63049322e3c231e7a0ef90116d WatchSource:0}: Error finding container 2cd6bcdc237d5014263bcfe603054958d12eed63049322e3c231e7a0ef90116d: Status 404 returned error can't find the container with id 2cd6bcdc237d5014263bcfe603054958d12eed63049322e3c231e7a0ef90116d Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.640771 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zhfdt" event={"ID":"37e7849a-97b9-4e3d-9ad3-c0c942775e64","Type":"ContainerStarted","Data":"2cd6bcdc237d5014263bcfe603054958d12eed63049322e3c231e7a0ef90116d"} Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.644343 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3","Type":"ContainerStarted","Data":"a65e85a159a9b17486796d380a3351deecb3414ee3e1d85ac5d1b6c55b301bc2"} Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.644744 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.646382 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg85q" event={"ID":"d82930a8-1630-4e85-86f0-0f2027e7225d","Type":"ContainerStarted","Data":"c5e5a41494c057433bfb54e940b25780c49840b79783e92af471fde7d1ea1aa5"} Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.647829 4771 generic.go:334] "Generic (PLEG): container finished" podID="57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" containerID="aeac629ea27b72a16934e3774e2c582b2e5bbd02ff8433ee069eb4e5b4f1cf42" exitCode=0 Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.648153 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" event={"ID":"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b","Type":"ContainerDied","Data":"aeac629ea27b72a16934e3774e2c582b2e5bbd02ff8433ee069eb4e5b4f1cf42"} Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.648169 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" event={"ID":"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b","Type":"ContainerStarted","Data":"d36f233662f63e71abc8847db49eee6b0afb6ae5d289929a282ca344145c74cd"} Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.673039 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69b7666f9c-x658x" event={"ID":"f610e3af-68b8-445a-b8c3-a9f7d4319fdf","Type":"ContainerStarted","Data":"ad17b20b7b10283a7336f75fa7d46b802a4c1486a7d1ca3d99981e6299621399"} Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.696425 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5b2xx" event={"ID":"20ec595c-6590-4b78-8ecc-f1e93d38d9f0","Type":"ContainerStarted","Data":"c181f99a92bb6947d480fa246210ecc8c94e2dc39fc910dafa79d202a559fae5"} Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.696472 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5b2xx" event={"ID":"20ec595c-6590-4b78-8ecc-f1e93d38d9f0","Type":"ContainerStarted","Data":"7ceddce40eef04b697d22781eda3fd2ff1f77e7c766a5b816606f04291e79a56"} Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.714765 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5b2xx" podStartSLOduration=2.7147492570000002 podStartE2EDuration="2.714749257s" podCreationTimestamp="2026-02-27 01:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:54.714526591 +0000 UTC m=+1147.652087879" watchObservedRunningTime="2026-02-27 01:23:54.714749257 +0000 UTC m=+1147.652310535" Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.874748 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tclqb"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.907478 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-594f74c97c-r6bp5"] Feb 27 01:23:54 crc kubenswrapper[4771]: I0227 01:23:54.952387 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lmh2v"] Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.156187 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.270561 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.434968 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-swift-storage-0\") pod \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.435229 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-nb\") pod \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.435325 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-sb\") pod \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.435373 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-svc\") pod \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.435430 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-kube-api-access-lxjfl\") pod \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.435518 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-config\") pod \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\" (UID: \"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b\") " Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.450641 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-kube-api-access-lxjfl" (OuterVolumeSpecName: "kube-api-access-lxjfl") pod "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" (UID: "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b"). InnerVolumeSpecName "kube-api-access-lxjfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.462235 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-config" (OuterVolumeSpecName: "config") pod "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" (UID: "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.468592 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" (UID: "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.472347 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" (UID: "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.484200 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" (UID: "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.489967 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" (UID: "57b3503d-2d97-4099-b26a-4c3ba2bb5c1b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.540404 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.540535 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.540577 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.540587 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-kube-api-access-lxjfl\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.540597 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.540606 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.540614 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:55 crc kubenswrapper[4771]: W0227 01:23:55.575356 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode69c9c6d_a6df_410f_81e9_ed9f0ac4f19e.slice/crio-51fada7abc1f4c186fada8b65c098a18317f07671a4099586e674b0a27ea1aef WatchSource:0}: Error finding container 
51fada7abc1f4c186fada8b65c098a18317f07671a4099586e674b0a27ea1aef: Status 404 returned error can't find the container with id 51fada7abc1f4c186fada8b65c098a18317f07671a4099586e674b0a27ea1aef Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.711004 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg85q" event={"ID":"d82930a8-1630-4e85-86f0-0f2027e7225d","Type":"ContainerStarted","Data":"78fcd55cba24557ee0c5e9291f2b4f14ef5cd69699b00ead6dac6bbbd1fe2ef4"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.717134 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e","Type":"ContainerStarted","Data":"51fada7abc1f4c186fada8b65c098a18317f07671a4099586e674b0a27ea1aef"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.721040 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" event={"ID":"57b3503d-2d97-4099-b26a-4c3ba2bb5c1b","Type":"ContainerDied","Data":"d36f233662f63e71abc8847db49eee6b0afb6ae5d289929a282ca344145c74cd"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.721093 4771 scope.go:117] "RemoveContainer" containerID="aeac629ea27b72a16934e3774e2c582b2e5bbd02ff8433ee069eb4e5b4f1cf42" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.721226 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-wjc2q" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.727859 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mg85q" podStartSLOduration=3.727839474 podStartE2EDuration="3.727839474s" podCreationTimestamp="2026-02-27 01:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:55.726319983 +0000 UTC m=+1148.663881271" watchObservedRunningTime="2026-02-27 01:23:55.727839474 +0000 UTC m=+1148.665400762" Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.728008 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594f74c97c-r6bp5" event={"ID":"e291ec97-2bfe-4bbe-a39d-9eca937f1855","Type":"ContainerStarted","Data":"e6bfd19b6bc88cee5b51d9a873846b38bed6b86f360da5995b0f86d8c030aeae"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.735417 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-r69k6" event={"ID":"a592bd48-ea9a-4f6c-a7fe-49185fbbed82","Type":"ContainerStarted","Data":"eeee4f76390f988441a32b7ca39a4c74044868e5e415b1893c2124acd0e9c7b2"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.737903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b6c309b4-8181-4f27-816a-f24419e2237f","Type":"ContainerStarted","Data":"8767056e16133c3f7cc5c02253756dbb5fa4ca990feb32d3561236e96c47b555"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.743139 4771 generic.go:334] "Generic (PLEG): container finished" podID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerID="e322bc4c34bfe742ef9e17db16433643e1554fca8bcb23fe3ecb49970965d046" exitCode=0 Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.743200 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" 
event={"ID":"394871e9-ec61-4b01-8d2a-90ce7785052b","Type":"ContainerDied","Data":"e322bc4c34bfe742ef9e17db16433643e1554fca8bcb23fe3ecb49970965d046"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.743290 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" event={"ID":"394871e9-ec61-4b01-8d2a-90ce7785052b","Type":"ContainerStarted","Data":"59bc82cd2aa6b2f99c765d45ceddb45e3db6fe8591265300c8688097af7a996c"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.749417 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tclqb" event={"ID":"b411543d-f7a2-4a56-acb5-9b2d9598739a","Type":"ContainerStarted","Data":"e24d075f680eafd68e20864a624e7af6935c55c75d8bdbc7902e246aac015cb7"} Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.820651 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-wjc2q"] Feb 27 01:23:55 crc kubenswrapper[4771]: I0227 01:23:55.843614 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-wjc2q"] Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.333931 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.376627 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69b7666f9c-x658x"] Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.382987 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.412452 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6dcdc5597c-5tv7x"] Feb 27 01:23:56 crc kubenswrapper[4771]: E0227 01:23:56.412864 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" containerName="init" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.412881 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" containerName="init" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.413084 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" containerName="init" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.414594 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.439750 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dcdc5597c-5tv7x"] Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.464613 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.563623 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0cc73fb-2983-4575-9a64-6d66336ef380-horizon-secret-key\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.563724 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5stm\" (UniqueName: \"kubernetes.io/projected/c0cc73fb-2983-4575-9a64-6d66336ef380-kube-api-access-v5stm\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.563835 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cc73fb-2983-4575-9a64-6d66336ef380-logs\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.563857 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-config-data\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.563884 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-scripts\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.667522 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5stm\" (UniqueName: \"kubernetes.io/projected/c0cc73fb-2983-4575-9a64-6d66336ef380-kube-api-access-v5stm\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.667625 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cc73fb-2983-4575-9a64-6d66336ef380-logs\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.667647 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-config-data\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 
01:23:56.667701 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-scripts\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.667730 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0cc73fb-2983-4575-9a64-6d66336ef380-horizon-secret-key\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.669959 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cc73fb-2983-4575-9a64-6d66336ef380-logs\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.671160 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-config-data\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.671627 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-scripts\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.676112 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0cc73fb-2983-4575-9a64-6d66336ef380-horizon-secret-key\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.705113 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5stm\" (UniqueName: \"kubernetes.io/projected/c0cc73fb-2983-4575-9a64-6d66336ef380-kube-api-access-v5stm\") pod \"horizon-6dcdc5597c-5tv7x\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.743934 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.826877 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b6c309b4-8181-4f27-816a-f24419e2237f","Type":"ContainerStarted","Data":"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d"} Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.879959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" event={"ID":"394871e9-ec61-4b01-8d2a-90ce7785052b","Type":"ContainerStarted","Data":"d939a357abf1eccc747917a70bbadfbe2d16e5df05e7b92b9be91f16e946efbd"} Feb 27 01:23:56 crc kubenswrapper[4771]: I0227 01:23:56.905349 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" podStartSLOduration=3.905326634 podStartE2EDuration="3.905326634s" podCreationTimestamp="2026-02-27 01:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:56.899700791 +0000 UTC m=+1149.837262079" watchObservedRunningTime="2026-02-27 01:23:56.905326634 +0000 UTC m=+1149.842887922" Feb 27 01:23:57 crc kubenswrapper[4771]: I0227 01:23:57.377934 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dcdc5597c-5tv7x"] Feb 27 01:23:57 crc kubenswrapper[4771]: W0227 01:23:57.396641 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0cc73fb_2983_4575_9a64_6d66336ef380.slice/crio-c0048cd9313c5a6bbf9e1729087d0bbab7e059122650b4d36ba3102b5faa80b4 WatchSource:0}: Error finding container c0048cd9313c5a6bbf9e1729087d0bbab7e059122650b4d36ba3102b5faa80b4: Status 404 returned error can't find the container with id c0048cd9313c5a6bbf9e1729087d0bbab7e059122650b4d36ba3102b5faa80b4 Feb 27 01:23:57 crc kubenswrapper[4771]: I0227 01:23:57.817136 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b3503d-2d97-4099-b26a-4c3ba2bb5c1b" path="/var/lib/kubelet/pods/57b3503d-2d97-4099-b26a-4c3ba2bb5c1b/volumes" Feb 27 01:23:57 crc kubenswrapper[4771]: I0227 01:23:57.927255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e","Type":"ContainerStarted","Data":"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be"} Feb 27 01:23:57 crc kubenswrapper[4771]: I0227 01:23:57.929700 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dcdc5597c-5tv7x" event={"ID":"c0cc73fb-2983-4575-9a64-6d66336ef380","Type":"ContainerStarted","Data":"c0048cd9313c5a6bbf9e1729087d0bbab7e059122650b4d36ba3102b5faa80b4"} Feb 27 01:23:57 crc kubenswrapper[4771]: I0227 01:23:57.937750 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b6c309b4-8181-4f27-816a-f24419e2237f","Type":"ContainerStarted","Data":"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24"} Feb 27 01:23:57 crc kubenswrapper[4771]: I0227 01:23:57.937827 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:23:57 crc kubenswrapper[4771]: I0227 01:23:57.937831 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="b6c309b4-8181-4f27-816a-f24419e2237f" containerName="glance-log" containerID="cri-o://3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d" gracePeriod=30 Feb 27 01:23:57 crc kubenswrapper[4771]: I0227 01:23:57.937915 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b6c309b4-8181-4f27-816a-f24419e2237f" containerName="glance-httpd" containerID="cri-o://aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24" gracePeriod=30 Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.014070 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.014051632 podStartE2EDuration="5.014051632s" podCreationTimestamp="2026-02-27 01:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:58.007083572 +0000 UTC m=+1150.944644920" watchObservedRunningTime="2026-02-27 01:23:58.014051632 +0000 UTC m=+1150.951612920" Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.893221 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.950673 4771 generic.go:334] "Generic (PLEG): container finished" podID="b6c309b4-8181-4f27-816a-f24419e2237f" containerID="aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24" exitCode=143 Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.950709 4771 generic.go:334] "Generic (PLEG): container finished" podID="b6c309b4-8181-4f27-816a-f24419e2237f" containerID="3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d" exitCode=143 Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.950736 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.950772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b6c309b4-8181-4f27-816a-f24419e2237f","Type":"ContainerDied","Data":"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24"} Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.950805 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b6c309b4-8181-4f27-816a-f24419e2237f","Type":"ContainerDied","Data":"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d"} Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.950820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b6c309b4-8181-4f27-816a-f24419e2237f","Type":"ContainerDied","Data":"8767056e16133c3f7cc5c02253756dbb5fa4ca990feb32d3561236e96c47b555"} Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.950839 4771 scope.go:117] "RemoveContainer" containerID="aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24" Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.952997 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.953029 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.953058 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.953200 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e","Type":"ContainerStarted","Data":"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab"} Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.953995 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3112e69f234defa1fcff4a9c5517c895c98346bf69153547a5fa6e13f50fed1"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.954059 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://f3112e69f234defa1fcff4a9c5517c895c98346bf69153547a5fa6e13f50fed1" gracePeriod=600 Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.954262 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerName="glance-log" 
containerID="cri-o://44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be" gracePeriod=30 Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.954319 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerName="glance-httpd" containerID="cri-o://298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab" gracePeriod=30 Feb 27 01:23:58 crc kubenswrapper[4771]: I0227 01:23:58.981361 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.981340642 podStartE2EDuration="5.981340642s" podCreationTimestamp="2026-02-27 01:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:23:58.975942245 +0000 UTC m=+1151.913503533" watchObservedRunningTime="2026-02-27 01:23:58.981340642 +0000 UTC m=+1151.918901930" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:58.998806 4771 scope.go:117] "RemoveContainer" containerID="3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.029770 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-logs\") pod \"b6c309b4-8181-4f27-816a-f24419e2237f\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.030252 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-public-tls-certs\") pod \"b6c309b4-8181-4f27-816a-f24419e2237f\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.030204 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-logs" (OuterVolumeSpecName: "logs") pod "b6c309b4-8181-4f27-816a-f24419e2237f" (UID: "b6c309b4-8181-4f27-816a-f24419e2237f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.030417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-httpd-run\") pod \"b6c309b4-8181-4f27-816a-f24419e2237f\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.031083 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b6c309b4-8181-4f27-816a-f24419e2237f" (UID: "b6c309b4-8181-4f27-816a-f24419e2237f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.031173 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b6c309b4-8181-4f27-816a-f24419e2237f\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.031523 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtdpd\" (UniqueName: \"kubernetes.io/projected/b6c309b4-8181-4f27-816a-f24419e2237f-kube-api-access-vtdpd\") pod \"b6c309b4-8181-4f27-816a-f24419e2237f\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.032075 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-config-data\") pod \"b6c309b4-8181-4f27-816a-f24419e2237f\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.032110 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-scripts\") pod \"b6c309b4-8181-4f27-816a-f24419e2237f\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.032336 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-combined-ca-bundle\") pod \"b6c309b4-8181-4f27-816a-f24419e2237f\" (UID: \"b6c309b4-8181-4f27-816a-f24419e2237f\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.032941 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.033021 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b6c309b4-8181-4f27-816a-f24419e2237f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.036636 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "b6c309b4-8181-4f27-816a-f24419e2237f" (UID: "b6c309b4-8181-4f27-816a-f24419e2237f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.040524 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c309b4-8181-4f27-816a-f24419e2237f-kube-api-access-vtdpd" (OuterVolumeSpecName: "kube-api-access-vtdpd") pod "b6c309b4-8181-4f27-816a-f24419e2237f" (UID: "b6c309b4-8181-4f27-816a-f24419e2237f"). InnerVolumeSpecName "kube-api-access-vtdpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.040876 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-scripts" (OuterVolumeSpecName: "scripts") pod "b6c309b4-8181-4f27-816a-f24419e2237f" (UID: "b6c309b4-8181-4f27-816a-f24419e2237f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.045921 4771 scope.go:117] "RemoveContainer" containerID="aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24" Feb 27 01:23:59 crc kubenswrapper[4771]: E0227 01:23:59.047783 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24\": container with ID starting with aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24 not found: ID does not exist" containerID="aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.047828 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24"} err="failed to get container status \"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24\": rpc error: code = NotFound desc = could not find container \"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24\": container with ID starting with aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24 not found: ID does not exist" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.047855 4771 scope.go:117] "RemoveContainer" containerID="3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d" Feb 27 01:23:59 crc kubenswrapper[4771]: E0227 01:23:59.050095 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d\": container with ID starting with 3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d not found: ID does not exist" containerID="3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.050164 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d"} err="failed to get container status \"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d\": rpc error: code = NotFound desc = could not find container \"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d\": container with ID starting with 3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d not found: ID does not exist" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.050202 4771 scope.go:117] "RemoveContainer" containerID="aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.051188 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24"} err="failed to get container status \"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24\": rpc error: code = NotFound desc = could not find container \"aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24\": container with ID starting with aae965270ca9e98148784feab5f2671ba4ab32ff839a7ccb0aea699d3ffaef24 not found: ID does not exist" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.051228 4771 scope.go:117] "RemoveContainer" containerID="3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.051668 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d"} err="failed to get container status \"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d\": rpc error: code = NotFound desc = could not find container \"3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d\": container with ID starting with 3946b501b6c21d66093b133e7dc7a6ce8650a5646695c483d5de54837e34456d not found: ID does not exist" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.097089 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b6c309b4-8181-4f27-816a-f24419e2237f" (UID: "b6c309b4-8181-4f27-816a-f24419e2237f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.101601 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6c309b4-8181-4f27-816a-f24419e2237f" (UID: "b6c309b4-8181-4f27-816a-f24419e2237f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.129759 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-config-data" (OuterVolumeSpecName: "config-data") pod "b6c309b4-8181-4f27-816a-f24419e2237f" (UID: "b6c309b4-8181-4f27-816a-f24419e2237f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.135098 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtdpd\" (UniqueName: \"kubernetes.io/projected/b6c309b4-8181-4f27-816a-f24419e2237f-kube-api-access-vtdpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.135128 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.135139 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.135148 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.135157 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c309b4-8181-4f27-816a-f24419e2237f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.135178 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.160248 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.238190 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.365778 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.376029 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.396535 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:23:59 crc kubenswrapper[4771]: E0227 01:23:59.396938 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c309b4-8181-4f27-816a-f24419e2237f" containerName="glance-log" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.396960 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c309b4-8181-4f27-816a-f24419e2237f" containerName="glance-log" Feb 27 01:23:59 crc kubenswrapper[4771]: E0227 01:23:59.396995 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c309b4-8181-4f27-816a-f24419e2237f" containerName="glance-httpd" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.397005 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c309b4-8181-4f27-816a-f24419e2237f" containerName="glance-httpd" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.397191 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c309b4-8181-4f27-816a-f24419e2237f" 
containerName="glance-log" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.397219 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c309b4-8181-4f27-816a-f24419e2237f" containerName="glance-httpd" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.398188 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.413898 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.414229 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.472942 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.542526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-scripts\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.542595 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.542625 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnjk9\" (UniqueName: \"kubernetes.io/projected/1289d10b-b2e9-4b14-ba46-c4de11e966be-kube-api-access-qnjk9\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.542688 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-logs\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.542723 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.542765 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-config-data\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.542789 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.542808 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.648435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-config-data\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.648495 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.648515 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.648576 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-scripts\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.648597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.648632 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnjk9\" (UniqueName: \"kubernetes.io/projected/1289d10b-b2e9-4b14-ba46-c4de11e966be-kube-api-access-qnjk9\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.648679 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-logs\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.648705 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.650892 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.651227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.652119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-logs\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.653735 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.658185 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-scripts\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.658890 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-config-data\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.667206 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.673353 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnjk9\" (UniqueName: \"kubernetes.io/projected/1289d10b-b2e9-4b14-ba46-c4de11e966be-kube-api-access-qnjk9\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.698594 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " pod="openstack/glance-default-external-api-0" Feb 27 
01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.752794 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.794155 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c309b4-8181-4f27-816a-f24419e2237f" path="/var/lib/kubelet/pods/b6c309b4-8181-4f27-816a-f24419e2237f/volumes" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.797564 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.853028 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-internal-tls-certs\") pod \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.853109 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.853193 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-config-data\") pod \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.853271 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pmc8\" (UniqueName: \"kubernetes.io/projected/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-kube-api-access-6pmc8\") pod \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.853334 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-logs\") pod \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.853360 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-combined-ca-bundle\") pod \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.853422 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-httpd-run\") pod \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.853444 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-scripts\") pod \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\" (UID: \"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e\") " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.854690 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" (UID: "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.854989 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-logs" (OuterVolumeSpecName: "logs") pod "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" (UID: "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.860340 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-scripts" (OuterVolumeSpecName: "scripts") pod "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" (UID: "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.861602 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" (UID: "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.862329 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-kube-api-access-6pmc8" (OuterVolumeSpecName: "kube-api-access-6pmc8") pod "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" (UID: "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e"). InnerVolumeSpecName "kube-api-access-6pmc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.897627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" (UID: "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.916394 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" (UID: "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.940989 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-config-data" (OuterVolumeSpecName: "config-data") pod "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" (UID: "e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.956758 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.960676 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.960709 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pmc8\" (UniqueName: \"kubernetes.io/projected/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-kube-api-access-6pmc8\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.960724 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.960733 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.960743 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.960751 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.960761 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.968840 4771 generic.go:334] "Generic (PLEG): container finished" podID="20ec595c-6590-4b78-8ecc-f1e93d38d9f0" containerID="c181f99a92bb6947d480fa246210ecc8c94e2dc39fc910dafa79d202a559fae5" exitCode=0 Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.968912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5b2xx" event={"ID":"20ec595c-6590-4b78-8ecc-f1e93d38d9f0","Type":"ContainerDied","Data":"c181f99a92bb6947d480fa246210ecc8c94e2dc39fc910dafa79d202a559fae5"} Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.973938 4771 generic.go:334] "Generic (PLEG): container finished" podID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerID="298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab" exitCode=0 Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.973968 4771 generic.go:334] "Generic (PLEG): container finished" podID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerID="44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be" exitCode=143 Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.974062 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.974775 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e","Type":"ContainerDied","Data":"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab"} Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.974823 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e","Type":"ContainerDied","Data":"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be"} Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.974838 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e","Type":"ContainerDied","Data":"51fada7abc1f4c186fada8b65c098a18317f07671a4099586e674b0a27ea1aef"} Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.974854 4771 scope.go:117] "RemoveContainer" containerID="298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.976752 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.996242 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="f3112e69f234defa1fcff4a9c5517c895c98346bf69153547a5fa6e13f50fed1" exitCode=0 Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.996286 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"f3112e69f234defa1fcff4a9c5517c895c98346bf69153547a5fa6e13f50fed1"} Feb 27 01:23:59 crc kubenswrapper[4771]: I0227 01:23:59.996311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"466a33b6112ab220887139a7abe10596ba6afedbccef8b636c28177f74cb6a85"} Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.062758 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.063991 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.072428 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.097324 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:00 crc kubenswrapper[4771]: E0227 01:24:00.100227 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerName="glance-httpd" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.100247 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerName="glance-httpd" Feb 27 01:24:00 crc kubenswrapper[4771]: E0227 01:24:00.100267 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerName="glance-log" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.100275 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerName="glance-log" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.100510 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerName="glance-httpd" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.100586 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" containerName="glance-log" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.108054 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.108167 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.114204 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.114256 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.173599 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535924-2x5jq"] Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.174910 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.178529 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-2x5jq"] Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.180678 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.180790 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.180858 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.266789 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.267407 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.267517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.267638 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl5mg\" (UniqueName: \"kubernetes.io/projected/71ddad50-134a-4525-ade2-057c655b1a8c-kube-api-access-vl5mg\") pod \"auto-csr-approver-29535924-2x5jq\" (UID: \"71ddad50-134a-4525-ade2-057c655b1a8c\") " pod="openshift-infra/auto-csr-approver-29535924-2x5jq" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.267754 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-logs\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.267776 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.267900 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.267922 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.267977 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n77l\" (UniqueName: \"kubernetes.io/projected/af436659-00a5-4e7d-80a2-66bd5c1f5e04-kube-api-access-7n77l\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369467 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl5mg\" (UniqueName: \"kubernetes.io/projected/71ddad50-134a-4525-ade2-057c655b1a8c-kube-api-access-vl5mg\") pod \"auto-csr-approver-29535924-2x5jq\" (UID: \"71ddad50-134a-4525-ade2-057c655b1a8c\") " pod="openshift-infra/auto-csr-approver-29535924-2x5jq" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369752 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-logs\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369768 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369841 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369869 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n77l\" (UniqueName: \"kubernetes.io/projected/af436659-00a5-4e7d-80a2-66bd5c1f5e04-kube-api-access-7n77l\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369911 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.369932 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.371037 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.372629 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-logs\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.373049 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.376432 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.376448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.376993 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.388065 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl5mg\" (UniqueName: \"kubernetes.io/projected/71ddad50-134a-4525-ade2-057c655b1a8c-kube-api-access-vl5mg\") pod \"auto-csr-approver-29535924-2x5jq\" (UID: \"71ddad50-134a-4525-ade2-057c655b1a8c\") " pod="openshift-infra/auto-csr-approver-29535924-2x5jq" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.388763 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.400216 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n77l\" (UniqueName: \"kubernetes.io/projected/af436659-00a5-4e7d-80a2-66bd5c1f5e04-kube-api-access-7n77l\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.408804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.438605 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.461402 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:00 crc kubenswrapper[4771]: I0227 01:24:00.503420 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" Feb 27 01:24:01 crc kubenswrapper[4771]: I0227 01:24:01.785119 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e" path="/var/lib/kubelet/pods/e69c9c6d-a6df-410f-81e9-ed9f0ac4f19e/volumes" Feb 27 01:24:01 crc kubenswrapper[4771]: I0227 01:24:01.986092 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594f74c97c-r6bp5"] Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.028038 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7fb8f8d788-kjgv6"] Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.030636 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.056017 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.098988 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb8f8d788-kjgv6"] Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.113269 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-logs\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.113398 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62nt5\" (UniqueName: \"kubernetes.io/projected/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-kube-api-access-62nt5\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.113457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-scripts\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.113522 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-config-data\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.113593 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-tls-certs\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.113939 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-secret-key\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc 
kubenswrapper[4771]: I0227 01:24:02.114317 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-combined-ca-bundle\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.137596 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.161667 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6dcdc5597c-5tv7x"] Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.193699 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-555c84df64-lmgxw"] Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.195078 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.219493 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62nt5\" (UniqueName: \"kubernetes.io/projected/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-kube-api-access-62nt5\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.219540 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-scripts\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.219574 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-config-data\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.219599 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-tls-certs\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.219616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-secret-key\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.219708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-combined-ca-bundle\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.219744 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-logs\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.220111 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-logs\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.221307 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-scripts\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.222236 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-config-data\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.233358 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-combined-ca-bundle\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.241229 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62nt5\" (UniqueName: \"kubernetes.io/projected/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-kube-api-access-62nt5\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.250316 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-secret-key\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.258065 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-tls-certs\") pod \"horizon-7fb8f8d788-kjgv6\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") " pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.303954 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-555c84df64-lmgxw"] Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.324201 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-combined-ca-bundle\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.324257 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-config-data\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.324301 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-logs\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.324321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlnp\" (UniqueName: \"kubernetes.io/projected/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-kube-api-access-qrlnp\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.324339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-horizon-secret-key\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.324404 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-horizon-tls-certs\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.324434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-scripts\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.329089 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.392741 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.426221 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-horizon-tls-certs\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.426297 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-scripts\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.426357 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-combined-ca-bundle\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.427068 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-scripts\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.427161 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-config-data\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.427242 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-logs\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.427562 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlnp\" (UniqueName: \"kubernetes.io/projected/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-kube-api-access-qrlnp\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.428340 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-horizon-secret-key\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.427804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-logs\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.428279 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-config-data\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.431105 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-horizon-tls-certs\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.453418 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-horizon-secret-key\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.455224 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlnp\" (UniqueName: \"kubernetes.io/projected/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-kube-api-access-qrlnp\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.460789 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db15a3b-2c83-4d54-b5ea-697e6362b4e9-combined-ca-bundle\") pod \"horizon-555c84df64-lmgxw\" (UID: \"9db15a3b-2c83-4d54-b5ea-697e6362b4e9\") " pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:02 crc kubenswrapper[4771]: I0227 01:24:02.644088 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.061685 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.067146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5b2xx" event={"ID":"20ec595c-6590-4b78-8ecc-f1e93d38d9f0","Type":"ContainerDied","Data":"7ceddce40eef04b697d22781eda3fd2ff1f77e7c766a5b816606f04291e79a56"} Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.067185 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ceddce40eef04b697d22781eda3fd2ff1f77e7c766a5b816606f04291e79a56" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.067237 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5b2xx" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.142466 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-scripts\") pod \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.142605 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-credential-keys\") pod \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.142745 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txl4f\" (UniqueName: \"kubernetes.io/projected/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-kube-api-access-txl4f\") pod \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.142851 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-fernet-keys\") pod \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.142904 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-config-data\") pod \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.142951 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-combined-ca-bundle\") pod \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\" (UID: \"20ec595c-6590-4b78-8ecc-f1e93d38d9f0\") " Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.147997 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-scripts" (OuterVolumeSpecName: "scripts") pod "20ec595c-6590-4b78-8ecc-f1e93d38d9f0" (UID: "20ec595c-6590-4b78-8ecc-f1e93d38d9f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.148025 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "20ec595c-6590-4b78-8ecc-f1e93d38d9f0" (UID: "20ec595c-6590-4b78-8ecc-f1e93d38d9f0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.148536 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-kube-api-access-txl4f" (OuterVolumeSpecName: "kube-api-access-txl4f") pod "20ec595c-6590-4b78-8ecc-f1e93d38d9f0" (UID: "20ec595c-6590-4b78-8ecc-f1e93d38d9f0"). InnerVolumeSpecName "kube-api-access-txl4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.151060 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "20ec595c-6590-4b78-8ecc-f1e93d38d9f0" (UID: "20ec595c-6590-4b78-8ecc-f1e93d38d9f0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.168823 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-config-data" (OuterVolumeSpecName: "config-data") pod "20ec595c-6590-4b78-8ecc-f1e93d38d9f0" (UID: "20ec595c-6590-4b78-8ecc-f1e93d38d9f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.169086 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20ec595c-6590-4b78-8ecc-f1e93d38d9f0" (UID: "20ec595c-6590-4b78-8ecc-f1e93d38d9f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.244748 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.244793 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.244808 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.244818 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.244831 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txl4f\" (UniqueName: \"kubernetes.io/projected/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-kube-api-access-txl4f\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.244843 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ec595c-6590-4b78-8ecc-f1e93d38d9f0-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:03 crc kubenswrapper[4771]: I0227 01:24:03.947337 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.013173 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-xsnfw"] Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.013494 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="dnsmasq-dns" 
containerID="cri-o://810d51041dcb69779df1a1971d06c3637f8c0a921f433df928b0fcf7547cb422" gracePeriod=10 Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.132119 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5b2xx"] Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.139308 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5b2xx"] Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.237277 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kqdqm"] Feb 27 01:24:04 crc kubenswrapper[4771]: E0227 01:24:04.237685 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ec595c-6590-4b78-8ecc-f1e93d38d9f0" containerName="keystone-bootstrap" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.237698 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ec595c-6590-4b78-8ecc-f1e93d38d9f0" containerName="keystone-bootstrap" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.237882 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ec595c-6590-4b78-8ecc-f1e93d38d9f0" containerName="keystone-bootstrap" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.238542 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.272814 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.272866 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.272900 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.274023 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kqdqm"] Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.276675 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7nd8k" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.278407 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.365429 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-scripts\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.365511 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-credential-keys\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.365535 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzls4\" (UniqueName: \"kubernetes.io/projected/57ec654f-6921-476d-8001-aec299744492-kube-api-access-hzls4\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 
crc kubenswrapper[4771]: I0227 01:24:04.365630 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-combined-ca-bundle\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.365665 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-fernet-keys\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.365696 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-config-data\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.468656 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-combined-ca-bundle\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.468743 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-fernet-keys\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.468796 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-config-data\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.468853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-scripts\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.469029 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-credential-keys\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.469053 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzls4\" (UniqueName: \"kubernetes.io/projected/57ec654f-6921-476d-8001-aec299744492-kube-api-access-hzls4\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.474157 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-fernet-keys\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.474358 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-combined-ca-bundle\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.476013 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-config-data\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.476353 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-scripts\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.477419 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-credential-keys\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.496687 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzls4\" (UniqueName: \"kubernetes.io/projected/57ec654f-6921-476d-8001-aec299744492-kube-api-access-hzls4\") pod \"keystone-bootstrap-kqdqm\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:04 crc kubenswrapper[4771]: I0227 01:24:04.580257 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:05 crc kubenswrapper[4771]: I0227 01:24:05.007763 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Feb 27 01:24:05 crc kubenswrapper[4771]: I0227 01:24:05.100463 4771 generic.go:334] "Generic (PLEG): container finished" podID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerID="810d51041dcb69779df1a1971d06c3637f8c0a921f433df928b0fcf7547cb422" exitCode=0 Feb 27 01:24:05 crc kubenswrapper[4771]: I0227 01:24:05.100503 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" event={"ID":"b1cbef08-6bd3-4010-8d53-914b02a1d670","Type":"ContainerDied","Data":"810d51041dcb69779df1a1971d06c3637f8c0a921f433df928b0fcf7547cb422"} Feb 27 01:24:05 crc kubenswrapper[4771]: I0227 01:24:05.782535 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ec595c-6590-4b78-8ecc-f1e93d38d9f0" path="/var/lib/kubelet/pods/20ec595c-6590-4b78-8ecc-f1e93d38d9f0/volumes" Feb 27 01:24:09 crc kubenswrapper[4771]: I0227 01:24:09.140675 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1289d10b-b2e9-4b14-ba46-c4de11e966be","Type":"ContainerStarted","Data":"7e0632c3e0ef302be6b2cb4e0815c7de0dc57c8d679389b5584e85b1d8de1a64"} Feb 27 01:24:10 crc kubenswrapper[4771]: I0227 01:24:10.007134 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Feb 27 01:24:10 crc kubenswrapper[4771]: E0227 01:24:10.694475 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 27 01:24:10 crc kubenswrapper[4771]: E0227 01:24:10.694647 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dh5f9h57h58chb4hch5f7h58ch8ch57bh4h65dh697hb9h4h5f7h569h64ch66dh5d8h548hc9hb8h6ch59fhdch597h5b8h7bh85hd7h664q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkfnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(56ed04c4-c2a4-47be-8b9f-faaea9aab6c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:24:10 crc kubenswrapper[4771]: I0227 01:24:10.725026 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:24:10 crc kubenswrapper[4771]: I0227 01:24:10.725138 4771 scope.go:117] "RemoveContainer" containerID="44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be" Feb 27 01:24:12 crc kubenswrapper[4771]: E0227 01:24:12.296812 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 27 01:24:12 crc kubenswrapper[4771]: E0227 01:24:12.297216 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7fh685h574h7dhfdh68ch5c8hbch67dh694h64fh5cbh57bh5d7h85h5b4h656hb9h64bh7dh87h646h58hb7h5fch5ddhddhc4h585h686h68ch68cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dws6k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-69b7666f9c-x658x_openstack(f610e3af-68b8-445a-b8c3-a9f7d4319fdf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:24:12 crc kubenswrapper[4771]: E0227 01:24:12.299970 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-69b7666f9c-x658x" podUID="f610e3af-68b8-445a-b8c3-a9f7d4319fdf" Feb 27 01:24:13 crc kubenswrapper[4771]: E0227 01:24:13.779618 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Feb 27 01:24:13 crc kubenswrapper[4771]: E0227 01:24:13.780024 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d5tfq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-tclqb_openstack(b411543d-f7a2-4a56-acb5-9b2d9598739a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:24:13 crc kubenswrapper[4771]: E0227 01:24:13.781384 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-tclqb" podUID="b411543d-f7a2-4a56-acb5-9b2d9598739a" Feb 27 01:24:13 crc kubenswrapper[4771]: E0227 01:24:13.797255 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 27 01:24:13 crc kubenswrapper[4771]: E0227 01:24:13.797414 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bh5bfh5cbhbbh8h6ch56h8h68chc9hd4h5d7h548hffhf4hd5h677h87h549h75h5bch669h595h666h597hc8h677h68dh5d4h645h55fh699q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5stm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6dcdc5597c-5tv7x_openstack(c0cc73fb-2983-4575-9a64-6d66336ef380): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:24:13 crc kubenswrapper[4771]: E0227 01:24:13.800311 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6dcdc5597c-5tv7x" podUID="c0cc73fb-2983-4575-9a64-6d66336ef380" Feb 27 01:24:14 crc kubenswrapper[4771]: E0227 01:24:14.188335 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-tclqb" podUID="b411543d-f7a2-4a56-acb5-9b2d9598739a" Feb 27 01:24:15 crc kubenswrapper[4771]: I0227 01:24:15.197527 4771 generic.go:334] "Generic (PLEG): container finished" podID="d82930a8-1630-4e85-86f0-0f2027e7225d" containerID="78fcd55cba24557ee0c5e9291f2b4f14ef5cd69699b00ead6dac6bbbd1fe2ef4" exitCode=0 Feb 27 01:24:15 crc kubenswrapper[4771]: I0227 01:24:15.197615 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg85q" event={"ID":"d82930a8-1630-4e85-86f0-0f2027e7225d","Type":"ContainerDied","Data":"78fcd55cba24557ee0c5e9291f2b4f14ef5cd69699b00ead6dac6bbbd1fe2ef4"} Feb 27 01:24:20 crc kubenswrapper[4771]: I0227 01:24:20.007371 4771 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Feb 27 01:24:20 crc kubenswrapper[4771]: I0227 01:24:20.008012 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:24:21 crc kubenswrapper[4771]: E0227 01:24:21.731194 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 27 01:24:21 crc kubenswrapper[4771]: E0227 01:24:21.731341 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vwlnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-r69k6_openstack(a592bd48-ea9a-4f6c-a7fe-49185fbbed82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:24:21 crc kubenswrapper[4771]: E0227 01:24:21.732529 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-r69k6" podUID="a592bd48-ea9a-4f6c-a7fe-49185fbbed82" Feb 27 01:24:21 crc kubenswrapper[4771]: I0227 01:24:21.935102 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:24:21 crc kubenswrapper[4771]: I0227 01:24:21.943118 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:24:21 crc kubenswrapper[4771]: I0227 01:24:21.948193 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:24:21 crc kubenswrapper[4771]: I0227 01:24:21.954345 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mg85q" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.109750 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-config\") pod \"d82930a8-1630-4e85-86f0-0f2027e7225d\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.109830 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-scripts\") pod \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.109898 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zhjb\" (UniqueName: \"kubernetes.io/projected/b1cbef08-6bd3-4010-8d53-914b02a1d670-kube-api-access-2zhjb\") pod \"b1cbef08-6bd3-4010-8d53-914b02a1d670\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.109923 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0cc73fb-2983-4575-9a64-6d66336ef380-horizon-secret-key\") pod \"c0cc73fb-2983-4575-9a64-6d66336ef380\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.109983 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-config-data\") pod \"c0cc73fb-2983-4575-9a64-6d66336ef380\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110003 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-nb\") pod \"b1cbef08-6bd3-4010-8d53-914b02a1d670\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110054 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-scripts\") pod \"c0cc73fb-2983-4575-9a64-6d66336ef380\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110085 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5stm\" (UniqueName: \"kubernetes.io/projected/c0cc73fb-2983-4575-9a64-6d66336ef380-kube-api-access-v5stm\") pod \"c0cc73fb-2983-4575-9a64-6d66336ef380\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110113 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-config\") pod \"b1cbef08-6bd3-4010-8d53-914b02a1d670\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110131 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-config-data\") pod \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110148 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-sb\") pod \"b1cbef08-6bd3-4010-8d53-914b02a1d670\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-svc\") pod \"b1cbef08-6bd3-4010-8d53-914b02a1d670\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110192 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-combined-ca-bundle\") pod \"d82930a8-1630-4e85-86f0-0f2027e7225d\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110228 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dgbb\" (UniqueName: \"kubernetes.io/projected/d82930a8-1630-4e85-86f0-0f2027e7225d-kube-api-access-5dgbb\") pod \"d82930a8-1630-4e85-86f0-0f2027e7225d\" (UID: \"d82930a8-1630-4e85-86f0-0f2027e7225d\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-logs\") pod \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110269 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-swift-storage-0\") pod \"b1cbef08-6bd3-4010-8d53-914b02a1d670\" (UID: \"b1cbef08-6bd3-4010-8d53-914b02a1d670\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110298 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cc73fb-2983-4575-9a64-6d66336ef380-logs\") pod \"c0cc73fb-2983-4575-9a64-6d66336ef380\" (UID: \"c0cc73fb-2983-4575-9a64-6d66336ef380\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110342 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dws6k\" (UniqueName: \"kubernetes.io/projected/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-kube-api-access-dws6k\") pod \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110377 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-horizon-secret-key\") pod \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\" (UID: \"f610e3af-68b8-445a-b8c3-a9f7d4319fdf\") " Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110638 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-scripts" (OuterVolumeSpecName: "scripts") pod "c0cc73fb-2983-4575-9a64-6d66336ef380" (UID: "c0cc73fb-2983-4575-9a64-6d66336ef380"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110670 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-scripts" (OuterVolumeSpecName: "scripts") pod "f610e3af-68b8-445a-b8c3-a9f7d4319fdf" (UID: "f610e3af-68b8-445a-b8c3-a9f7d4319fdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110908 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-config-data" (OuterVolumeSpecName: "config-data") pod "c0cc73fb-2983-4575-9a64-6d66336ef380" (UID: "c0cc73fb-2983-4575-9a64-6d66336ef380"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.110986 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.111010 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.111569 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-config-data" (OuterVolumeSpecName: "config-data") pod "f610e3af-68b8-445a-b8c3-a9f7d4319fdf" (UID: "f610e3af-68b8-445a-b8c3-a9f7d4319fdf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.115833 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82930a8-1630-4e85-86f0-0f2027e7225d-kube-api-access-5dgbb" (OuterVolumeSpecName: "kube-api-access-5dgbb") pod "d82930a8-1630-4e85-86f0-0f2027e7225d" (UID: "d82930a8-1630-4e85-86f0-0f2027e7225d"). InnerVolumeSpecName "kube-api-access-5dgbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.116463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0cc73fb-2983-4575-9a64-6d66336ef380-logs" (OuterVolumeSpecName: "logs") pod "c0cc73fb-2983-4575-9a64-6d66336ef380" (UID: "c0cc73fb-2983-4575-9a64-6d66336ef380"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.116658 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-logs" (OuterVolumeSpecName: "logs") pod "f610e3af-68b8-445a-b8c3-a9f7d4319fdf" (UID: "f610e3af-68b8-445a-b8c3-a9f7d4319fdf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.119005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0cc73fb-2983-4575-9a64-6d66336ef380-kube-api-access-v5stm" (OuterVolumeSpecName: "kube-api-access-v5stm") pod "c0cc73fb-2983-4575-9a64-6d66336ef380" (UID: "c0cc73fb-2983-4575-9a64-6d66336ef380"). InnerVolumeSpecName "kube-api-access-v5stm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.119272 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f610e3af-68b8-445a-b8c3-a9f7d4319fdf" (UID: "f610e3af-68b8-445a-b8c3-a9f7d4319fdf"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.119344 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-kube-api-access-dws6k" (OuterVolumeSpecName: "kube-api-access-dws6k") pod "f610e3af-68b8-445a-b8c3-a9f7d4319fdf" (UID: "f610e3af-68b8-445a-b8c3-a9f7d4319fdf"). InnerVolumeSpecName "kube-api-access-dws6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.120280 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1cbef08-6bd3-4010-8d53-914b02a1d670-kube-api-access-2zhjb" (OuterVolumeSpecName: "kube-api-access-2zhjb") pod "b1cbef08-6bd3-4010-8d53-914b02a1d670" (UID: "b1cbef08-6bd3-4010-8d53-914b02a1d670"). InnerVolumeSpecName "kube-api-access-2zhjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.123644 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0cc73fb-2983-4575-9a64-6d66336ef380-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c0cc73fb-2983-4575-9a64-6d66336ef380" (UID: "c0cc73fb-2983-4575-9a64-6d66336ef380"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.140315 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d82930a8-1630-4e85-86f0-0f2027e7225d" (UID: "d82930a8-1630-4e85-86f0-0f2027e7225d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.158969 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1cbef08-6bd3-4010-8d53-914b02a1d670" (UID: "b1cbef08-6bd3-4010-8d53-914b02a1d670"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.161679 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-config" (OuterVolumeSpecName: "config") pod "d82930a8-1630-4e85-86f0-0f2027e7225d" (UID: "d82930a8-1630-4e85-86f0-0f2027e7225d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.169446 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-config" (OuterVolumeSpecName: "config") pod "b1cbef08-6bd3-4010-8d53-914b02a1d670" (UID: "b1cbef08-6bd3-4010-8d53-914b02a1d670"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.170795 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b1cbef08-6bd3-4010-8d53-914b02a1d670" (UID: "b1cbef08-6bd3-4010-8d53-914b02a1d670"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.170570 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1cbef08-6bd3-4010-8d53-914b02a1d670" (UID: "b1cbef08-6bd3-4010-8d53-914b02a1d670"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.180330 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1cbef08-6bd3-4010-8d53-914b02a1d670" (UID: "b1cbef08-6bd3-4010-8d53-914b02a1d670"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212644 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212676 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212688 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0cc73fb-2983-4575-9a64-6d66336ef380-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212697 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dws6k\" (UniqueName: \"kubernetes.io/projected/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-kube-api-access-dws6k\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212706 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212716 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212724 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zhjb\" (UniqueName: \"kubernetes.io/projected/b1cbef08-6bd3-4010-8d53-914b02a1d670-kube-api-access-2zhjb\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212733 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0cc73fb-2983-4575-9a64-6d66336ef380-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212740 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cc73fb-2983-4575-9a64-6d66336ef380-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212749 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212757 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5stm\" (UniqueName: \"kubernetes.io/projected/c0cc73fb-2983-4575-9a64-6d66336ef380-kube-api-access-v5stm\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212767 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212775 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f610e3af-68b8-445a-b8c3-a9f7d4319fdf-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc 
kubenswrapper[4771]: I0227 01:24:22.212782 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212791 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1cbef08-6bd3-4010-8d53-914b02a1d670-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212799 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82930a8-1630-4e85-86f0-0f2027e7225d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.212807 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dgbb\" (UniqueName: \"kubernetes.io/projected/d82930a8-1630-4e85-86f0-0f2027e7225d-kube-api-access-5dgbb\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.292054 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mg85q" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.292068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg85q" event={"ID":"d82930a8-1630-4e85-86f0-0f2027e7225d","Type":"ContainerDied","Data":"c5e5a41494c057433bfb54e940b25780c49840b79783e92af471fde7d1ea1aa5"} Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.292114 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e5a41494c057433bfb54e940b25780c49840b79783e92af471fde7d1ea1aa5" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.295583 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dcdc5597c-5tv7x" event={"ID":"c0cc73fb-2983-4575-9a64-6d66336ef380","Type":"ContainerDied","Data":"c0048cd9313c5a6bbf9e1729087d0bbab7e059122650b4d36ba3102b5faa80b4"} Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.295815 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dcdc5597c-5tv7x" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.297063 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69b7666f9c-x658x" event={"ID":"f610e3af-68b8-445a-b8c3-a9f7d4319fdf","Type":"ContainerDied","Data":"ad17b20b7b10283a7336f75fa7d46b802a4c1486a7d1ca3d99981e6299621399"} Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.297136 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69b7666f9c-x658x" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.301380 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" event={"ID":"b1cbef08-6bd3-4010-8d53-914b02a1d670","Type":"ContainerDied","Data":"55cadb4e1601e7eec6949f561b2fddf69faa521c797b9684a117ddc636666383"} Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.301420 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" Feb 27 01:24:22 crc kubenswrapper[4771]: E0227 01:24:22.302908 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-r69k6" podUID="a592bd48-ea9a-4f6c-a7fe-49185fbbed82" Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.389223 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6dcdc5597c-5tv7x"] Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.410611 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6dcdc5597c-5tv7x"] Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.429964 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69b7666f9c-x658x"] Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.437220 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-69b7666f9c-x658x"] Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.446373 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-xsnfw"] Feb 27 01:24:22 crc kubenswrapper[4771]: I0227 01:24:22.453632 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-xsnfw"] Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.077035 4771 scope.go:117] "RemoveContainer" containerID="298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab" Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.077771 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab\": container with ID starting with 298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab not found: ID does not exist" containerID="298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.077796 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab"} err="failed to get container status \"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab\": rpc error: code = NotFound desc = could not find container \"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab\": container with ID starting with 298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab not found: ID does not exist" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.077816 4771 scope.go:117] "RemoveContainer" containerID="44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be" Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.078164 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be\": container with ID starting with 44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be not found: ID does not exist" containerID="44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.078206 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be"} 
err="failed to get container status \"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be\": rpc error: code = NotFound desc = could not find container \"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be\": container with ID starting with 44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be not found: ID does not exist" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.078232 4771 scope.go:117] "RemoveContainer" containerID="298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.078487 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab"} err="failed to get container status \"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab\": rpc error: code = NotFound desc = could not find container \"298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab\": container with ID starting with 298ec68cf3ac857040e6c8e715e4c9e8784bce6f8a571ee42d6fefb5064a34ab not found: ID does not exist" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.078507 4771 scope.go:117] "RemoveContainer" containerID="44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.078770 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be"} err="failed to get container status \"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be\": rpc error: code = NotFound desc = could not find container \"44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be\": container with ID starting with 44f80634fef71224bf8b704684822a977e931ab5e3f9b9686996045019da81be not found: ID does not exist" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.078794 4771 scope.go:117] "RemoveContainer" containerID="c06019bd1d417bdca00ed2eff4e51501f46dbc51fa52f89a80770d81ea06c432" Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.108043 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.108202 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-scgr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zhfdt_openstack(37e7849a-97b9-4e3d-9ad3-c0c942775e64): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.109347 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zhfdt" podUID="37e7849a-97b9-4e3d-9ad3-c0c942775e64" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.242260 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-n5x8h"] Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.244825 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82930a8-1630-4e85-86f0-0f2027e7225d" containerName="neutron-db-sync" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.244894 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82930a8-1630-4e85-86f0-0f2027e7225d" containerName="neutron-db-sync" Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.244916 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="init" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 
01:24:23.244927 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="init" Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.244956 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="dnsmasq-dns" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.244967 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="dnsmasq-dns" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.245219 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="dnsmasq-dns" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.245236 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82930a8-1630-4e85-86f0-0f2027e7225d" containerName="neutron-db-sync" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.246270 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.267222 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-n5x8h"] Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.317175 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-zhfdt" podUID="37e7849a-97b9-4e3d-9ad3-c0c942775e64" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.330578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-svc\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.330981 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlfzm\" (UniqueName: \"kubernetes.io/projected/345ab929-8a28-4d72-a196-bd831e1f3d0a-kube-api-access-tlfzm\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.331045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.331131 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.331150 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.331367 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-config\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.345471 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5fc95dbbd4-gfl9m"] Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.346952 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.350839 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.351144 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.351278 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hjxfj" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.352796 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.357239 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fc95dbbd4-gfl9m"] Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.432621 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-combined-ca-bundle\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.432798 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-svc\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.432962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlfzm\" (UniqueName: \"kubernetes.io/projected/345ab929-8a28-4d72-a196-bd831e1f3d0a-kube-api-access-tlfzm\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.433012 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.433091 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-config\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.433187 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.433217 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.433260 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-httpd-config\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.433307 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-ovndb-tls-certs\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.433611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-config\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.433647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v5vb\" (UniqueName: \"kubernetes.io/projected/dd57f8bd-d811-4740-b644-f8d69d329d5c-kube-api-access-9v5vb\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.434099 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-svc\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.434292 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.434705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.434850 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-config\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.435198 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.449801 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlfzm\" (UniqueName: \"kubernetes.io/projected/345ab929-8a28-4d72-a196-bd831e1f3d0a-kube-api-access-tlfzm\") pod \"dnsmasq-dns-55f844cf75-n5x8h\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") " pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.535177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-config\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.535306 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-httpd-config\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.535355 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-ovndb-tls-certs\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.535501 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v5vb\" (UniqueName: \"kubernetes.io/projected/dd57f8bd-d811-4740-b644-f8d69d329d5c-kube-api-access-9v5vb\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.535585 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-combined-ca-bundle\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.539782 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-config\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: 
\"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.540196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-combined-ca-bundle\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.542719 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-httpd-config\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.550204 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-ovndb-tls-certs\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.555718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v5vb\" (UniqueName: \"kubernetes.io/projected/dd57f8bd-d811-4740-b644-f8d69d329d5c-kube-api-access-9v5vb\") pod \"neutron-5fc95dbbd4-gfl9m\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.591785 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.666826 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.779171 4771 scope.go:117] "RemoveContainer" containerID="810d51041dcb69779df1a1971d06c3637f8c0a921f433df928b0fcf7547cb422" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.802007 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" path="/var/lib/kubelet/pods/b1cbef08-6bd3-4010-8d53-914b02a1d670/volumes" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.802654 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0cc73fb-2983-4575-9a64-6d66336ef380" path="/var/lib/kubelet/pods/c0cc73fb-2983-4575-9a64-6d66336ef380/volumes" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.803333 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f610e3af-68b8-445a-b8c3-a9f7d4319fdf" path="/var/lib/kubelet/pods/f610e3af-68b8-445a-b8c3-a9f7d4319fdf/volumes" Feb 27 01:24:23 crc kubenswrapper[4771]: I0227 01:24:23.894393 4771 scope.go:117] "RemoveContainer" containerID="8c02a2260da6fe30018e2305175d0b7d93abf4c3b41aa8c92426c5453b31b0fe" Feb 27 01:24:23 crc kubenswrapper[4771]: E0227 01:24:23.900025 4771 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/dnsmasq-dns-74f6bcbc87-xsnfw_openstack_init-8c02a2260da6fe30018e2305175d0b7d93abf4c3b41aa8c92426c5453b31b0fe.log: no such file or directory" path="/var/log/containers/dnsmasq-dns-74f6bcbc87-xsnfw_openstack_init-8c02a2260da6fe30018e2305175d0b7d93abf4c3b41aa8c92426c5453b31b0fe.log" Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.249851 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb8f8d788-kjgv6"] Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.264790 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.272434 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-555c84df64-lmgxw"] Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.334893 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8f8d788-kjgv6" event={"ID":"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7","Type":"ContainerStarted","Data":"7606139dd1784903ed5746e62418199c84c7f11ef1ad702a2612f7414ea95dcb"} Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.343329 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c84df64-lmgxw" event={"ID":"9db15a3b-2c83-4d54-b5ea-697e6362b4e9","Type":"ContainerStarted","Data":"9be66deb47f552abdff201141830e849f00e3100db95d53237e7b723c940ada6"} Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.357515 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594f74c97c-r6bp5" event={"ID":"e291ec97-2bfe-4bbe-a39d-9eca937f1855","Type":"ContainerStarted","Data":"c6a969f9e4477c27a136509637556ccc0114acef1770783092c387907188497e"} Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.357891 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-594f74c97c-r6bp5" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerName="horizon-log" containerID="cri-o://c6a969f9e4477c27a136509637556ccc0114acef1770783092c387907188497e" gracePeriod=30 Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.358303 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-594f74c97c-r6bp5" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerName="horizon" containerID="cri-o://fc53994384bdc2f24024e47615918ee993ad81722b3c10093142c5fd1d9a7756" gracePeriod=30 Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.410766 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-594f74c97c-r6bp5" podStartSLOduration=4.575398696 podStartE2EDuration="31.410749773s" podCreationTimestamp="2026-02-27 01:23:53 +0000 UTC" firstStartedPulling="2026-02-27 01:23:54.930068916 +0000 UTC m=+1147.867630204" lastFinishedPulling="2026-02-27 01:24:21.765419993 +0000 UTC m=+1174.702981281" observedRunningTime="2026-02-27 01:24:24.380789808 +0000 UTC m=+1177.318351096" watchObservedRunningTime="2026-02-27 01:24:24.410749773 +0000 UTC m=+1177.348311061" Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.426705 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-2x5jq"] Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.429983 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3","Type":"ContainerStarted","Data":"0a9b0e1fe52972b5a3bc596d21d1528fa23c441d49b0f1a09d291cf2e6eeea3e"} Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.431484 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af436659-00a5-4e7d-80a2-66bd5c1f5e04","Type":"ContainerStarted","Data":"1706e27f6aefc5d84f0deb3328ec89fa2a4b038916f5482756c84fb7a59af688"} Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.439307 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kqdqm"] Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.447865 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-n5x8h"] Feb 27 01:24:24 crc kubenswrapper[4771]: I0227 01:24:24.770243 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fc95dbbd4-gfl9m"] Feb 27 01:24:24 crc kubenswrapper[4771]: W0227 01:24:24.849525 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd57f8bd_d811_4740_b644_f8d69d329d5c.slice/crio-5f418fc1a98d835b32c70fd3ba2b9ca68bc7ab8b6f499afce4a205a35286b35f WatchSource:0}: Error finding container 5f418fc1a98d835b32c70fd3ba2b9ca68bc7ab8b6f499afce4a205a35286b35f: Status 404 returned error can't find the container with id 5f418fc1a98d835b32c70fd3ba2b9ca68bc7ab8b6f499afce4a205a35286b35f Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.008869 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-xsnfw" podUID="b1cbef08-6bd3-4010-8d53-914b02a1d670" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.321200 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c5945c865-z7kz7"] Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.324032 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.341984 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.342178 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.343111 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c5945c865-z7kz7"] Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.494246 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-combined-ca-bundle\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.494507 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-ovndb-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.494526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-config\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.494592 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk2lj\" (UniqueName: \"kubernetes.io/projected/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-kube-api-access-wk2lj\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.494625 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-httpd-config\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.494647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-internal-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.494665 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-public-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.495867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8f8d788-kjgv6" 
event={"ID":"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7","Type":"ContainerStarted","Data":"ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.495904 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8f8d788-kjgv6" event={"ID":"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7","Type":"ContainerStarted","Data":"f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.503723 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc95dbbd4-gfl9m" event={"ID":"dd57f8bd-d811-4740-b644-f8d69d329d5c","Type":"ContainerStarted","Data":"5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.503770 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc95dbbd4-gfl9m" event={"ID":"dd57f8bd-d811-4740-b644-f8d69d329d5c","Type":"ContainerStarted","Data":"5f418fc1a98d835b32c70fd3ba2b9ca68bc7ab8b6f499afce4a205a35286b35f"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.503930 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.509718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" event={"ID":"71ddad50-134a-4525-ade2-057c655b1a8c","Type":"ContainerStarted","Data":"431659f0c36b7dc87ded08281ee9357b730e23c85256a0be373319f7ce870b33"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.522791 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7fb8f8d788-kjgv6" podStartSLOduration=24.522774932 podStartE2EDuration="24.522774932s" podCreationTimestamp="2026-02-27 01:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:25.5219684 +0000 UTC m=+1178.459529678" watchObservedRunningTime="2026-02-27 01:24:25.522774932 +0000 UTC m=+1178.460336220" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.525566 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1289d10b-b2e9-4b14-ba46-c4de11e966be","Type":"ContainerStarted","Data":"5975a195724de8123a7141d59debc3ff9ad49c0b0de299c853384c4b0370a249"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.525602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1289d10b-b2e9-4b14-ba46-c4de11e966be","Type":"ContainerStarted","Data":"08126fd2f44b949d894404c7c136317e8709e495ba119b3350a4bb74ebbe92db"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.525755 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerName="glance-log" containerID="cri-o://08126fd2f44b949d894404c7c136317e8709e495ba119b3350a4bb74ebbe92db" gracePeriod=30 Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.525934 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerName="glance-httpd" containerID="cri-o://5975a195724de8123a7141d59debc3ff9ad49c0b0de299c853384c4b0370a249" gracePeriod=30 Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 
01:24:25.548873 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5fc95dbbd4-gfl9m" podStartSLOduration=2.548857011 podStartE2EDuration="2.548857011s" podCreationTimestamp="2026-02-27 01:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:25.541047079 +0000 UTC m=+1178.478608367" watchObservedRunningTime="2026-02-27 01:24:25.548857011 +0000 UTC m=+1178.486418299" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.553921 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kqdqm" event={"ID":"57ec654f-6921-476d-8001-aec299744492","Type":"ContainerStarted","Data":"c696924d536a08d6ade8f59f5787d17d17f48f3070b5b227285d140bca360a9c"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.553959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kqdqm" event={"ID":"57ec654f-6921-476d-8001-aec299744492","Type":"ContainerStarted","Data":"2ae665dea5f555dbff0916bc220a3ab9906c2e5a42df2675362269f6532de747"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.575918 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.575900017 podStartE2EDuration="26.575900017s" podCreationTimestamp="2026-02-27 01:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:25.575454985 +0000 UTC m=+1178.513016283" watchObservedRunningTime="2026-02-27 01:24:25.575900017 +0000 UTC m=+1178.513461305" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.585345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c84df64-lmgxw" event={"ID":"9db15a3b-2c83-4d54-b5ea-697e6362b4e9","Type":"ContainerStarted","Data":"8ce762f8c62713b83b01f628b40a5dd8b2045c17148e0a6c64193d90de7659b8"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.585395 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c84df64-lmgxw" event={"ID":"9db15a3b-2c83-4d54-b5ea-697e6362b4e9","Type":"ContainerStarted","Data":"a344be4daa648bbd9910f8c6df5438a604dd18f5ce7c04366cc11b2ae7cbce54"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.588836 4771 generic.go:334] "Generic (PLEG): container finished" podID="345ab929-8a28-4d72-a196-bd831e1f3d0a" containerID="c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f" exitCode=0 Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.588891 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" event={"ID":"345ab929-8a28-4d72-a196-bd831e1f3d0a","Type":"ContainerDied","Data":"c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.588906 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" event={"ID":"345ab929-8a28-4d72-a196-bd831e1f3d0a","Type":"ContainerStarted","Data":"a523b83193d12a4f2500465609748f068d334769a2f8403113a9fd53af811240"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.599513 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-httpd-config\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " 
pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.599579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-internal-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.599610 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-public-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.599655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-combined-ca-bundle\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.599743 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-ovndb-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.599763 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-config\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.599817 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk2lj\" (UniqueName: \"kubernetes.io/projected/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-kube-api-access-wk2lj\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.600258 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kqdqm" podStartSLOduration=21.60023901 podStartE2EDuration="21.60023901s" podCreationTimestamp="2026-02-27 01:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:25.591247564 +0000 UTC m=+1178.528808862" watchObservedRunningTime="2026-02-27 01:24:25.60023901 +0000 UTC m=+1178.537800298" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.608448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-httpd-config\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.616469 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594f74c97c-r6bp5" 
event={"ID":"e291ec97-2bfe-4bbe-a39d-9eca937f1855","Type":"ContainerStarted","Data":"fc53994384bdc2f24024e47615918ee993ad81722b3c10093142c5fd1d9a7756"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.620513 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-ovndb-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.629914 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af436659-00a5-4e7d-80a2-66bd5c1f5e04","Type":"ContainerStarted","Data":"66685625b219e5dc6ee7ae59ebda3ee9c5966b929180288b7b70d54a1d7a242f"} Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.640384 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk2lj\" (UniqueName: \"kubernetes.io/projected/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-kube-api-access-wk2lj\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.641504 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-combined-ca-bundle\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.642212 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-public-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.642808 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-555c84df64-lmgxw" podStartSLOduration=23.642798687 podStartE2EDuration="23.642798687s" podCreationTimestamp="2026-02-27 01:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:25.640901046 +0000 UTC m=+1178.578462334" watchObservedRunningTime="2026-02-27 01:24:25.642798687 +0000 UTC m=+1178.580359975" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.642997 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-config\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.645246 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-internal-tls-certs\") pod \"neutron-6c5945c865-z7kz7\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") " pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:25 crc kubenswrapper[4771]: I0227 01:24:25.777145 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.548523 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c5945c865-z7kz7"] Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.656113 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af436659-00a5-4e7d-80a2-66bd5c1f5e04","Type":"ContainerStarted","Data":"5a6640b56c291cf50f39ebc8d2f8c3dabeb37fad62afd61d0d2487b3fb0dc78c"} Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.656315 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerName="glance-log" containerID="cri-o://66685625b219e5dc6ee7ae59ebda3ee9c5966b929180288b7b70d54a1d7a242f" gracePeriod=30 Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.656667 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerName="glance-httpd" containerID="cri-o://5a6640b56c291cf50f39ebc8d2f8c3dabeb37fad62afd61d0d2487b3fb0dc78c" gracePeriod=30 Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.671782 4771 generic.go:334] "Generic (PLEG): container finished" podID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerID="5975a195724de8123a7141d59debc3ff9ad49c0b0de299c853384c4b0370a249" exitCode=0 Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.671816 4771 generic.go:334] "Generic (PLEG): container finished" podID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerID="08126fd2f44b949d894404c7c136317e8709e495ba119b3350a4bb74ebbe92db" exitCode=143 Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.671897 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1289d10b-b2e9-4b14-ba46-c4de11e966be","Type":"ContainerDied","Data":"5975a195724de8123a7141d59debc3ff9ad49c0b0de299c853384c4b0370a249"} Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.671924 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1289d10b-b2e9-4b14-ba46-c4de11e966be","Type":"ContainerDied","Data":"08126fd2f44b949d894404c7c136317e8709e495ba119b3350a4bb74ebbe92db"} Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.688576 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c5945c865-z7kz7" event={"ID":"eca4c4e6-7c04-473a-921b-c6f7e98c81b3","Type":"ContainerStarted","Data":"fb38757c81a6e8a2f500cbd4f718222cf897a2c41b9583c86be6d120203ab552"} Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.690164 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" event={"ID":"345ab929-8a28-4d72-a196-bd831e1f3d0a","Type":"ContainerStarted","Data":"617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c"} Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.690923 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.685510 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=26.68549109 podStartE2EDuration="26.68549109s" podCreationTimestamp="2026-02-27 01:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:26.6803656 +0000 UTC m=+1179.617926878" watchObservedRunningTime="2026-02-27 01:24:26.68549109 +0000 UTC m=+1179.623052378" Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.710593 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" podStartSLOduration=3.710571222 podStartE2EDuration="3.710571222s" podCreationTimestamp="2026-02-27 01:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:26.706705827 +0000 UTC m=+1179.644267115" watchObservedRunningTime="2026-02-27 01:24:26.710571222 +0000 UTC m=+1179.648132520" Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.711907 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc95dbbd4-gfl9m" event={"ID":"dd57f8bd-d811-4740-b644-f8d69d329d5c","Type":"ContainerStarted","Data":"1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57"} Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.729490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" event={"ID":"71ddad50-134a-4525-ade2-057c655b1a8c","Type":"ContainerStarted","Data":"abe9e9b98dd3724d60e36667b2ac53e4e7a0bb1891c7a5429756bec39892f1dc"} Feb 27 01:24:26 crc kubenswrapper[4771]: I0227 01:24:26.758308 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" podStartSLOduration=25.743508467 podStartE2EDuration="26.75828202s" podCreationTimestamp="2026-02-27 01:24:00 +0000 UTC" firstStartedPulling="2026-02-27 01:24:24.470853548 +0000 UTC m=+1177.408414836" lastFinishedPulling="2026-02-27 01:24:25.485627101 +0000 UTC m=+1178.423188389" observedRunningTime="2026-02-27 01:24:26.75787995 +0000 UTC m=+1179.695441248" watchObservedRunningTime="2026-02-27 01:24:26.75828202 +0000 UTC m=+1179.695843308" Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.755658 4771 generic.go:334] "Generic (PLEG): container finished" podID="71ddad50-134a-4525-ade2-057c655b1a8c" containerID="abe9e9b98dd3724d60e36667b2ac53e4e7a0bb1891c7a5429756bec39892f1dc" exitCode=0 Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.755751 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" event={"ID":"71ddad50-134a-4525-ade2-057c655b1a8c","Type":"ContainerDied","Data":"abe9e9b98dd3724d60e36667b2ac53e4e7a0bb1891c7a5429756bec39892f1dc"} Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.776387 4771 generic.go:334] "Generic (PLEG): container finished" podID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerID="5a6640b56c291cf50f39ebc8d2f8c3dabeb37fad62afd61d0d2487b3fb0dc78c" exitCode=0 Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.776795 4771 generic.go:334] "Generic (PLEG): container finished" podID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerID="66685625b219e5dc6ee7ae59ebda3ee9c5966b929180288b7b70d54a1d7a242f" exitCode=143 Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.806059 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c5945c865-z7kz7" Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.806089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"af436659-00a5-4e7d-80a2-66bd5c1f5e04","Type":"ContainerDied","Data":"5a6640b56c291cf50f39ebc8d2f8c3dabeb37fad62afd61d0d2487b3fb0dc78c"} Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.806106 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af436659-00a5-4e7d-80a2-66bd5c1f5e04","Type":"ContainerDied","Data":"66685625b219e5dc6ee7ae59ebda3ee9c5966b929180288b7b70d54a1d7a242f"} Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.806115 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c5945c865-z7kz7" event={"ID":"eca4c4e6-7c04-473a-921b-c6f7e98c81b3","Type":"ContainerStarted","Data":"f61f2708c8d73b030c8335a3c492a7f2c6f3eef399fef9b0a0394e4fc6e69bf6"} Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.806125 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c5945c865-z7kz7" event={"ID":"eca4c4e6-7c04-473a-921b-c6f7e98c81b3","Type":"ContainerStarted","Data":"ed10914f70340fa9edb88427c9685cc726d0e02226638f2978714fec22b0b45f"} Feb 27 01:24:27 crc kubenswrapper[4771]: I0227 01:24:27.863817 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c5945c865-z7kz7" podStartSLOduration=2.863795811 podStartE2EDuration="2.863795811s" podCreationTimestamp="2026-02-27 01:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:27.857804348 +0000 UTC m=+1180.795365656" watchObservedRunningTime="2026-02-27 01:24:27.863795811 +0000 UTC m=+1180.801357099" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.185296 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.280352 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnjk9\" (UniqueName: \"kubernetes.io/projected/1289d10b-b2e9-4b14-ba46-c4de11e966be-kube-api-access-qnjk9\") pod \"1289d10b-b2e9-4b14-ba46-c4de11e966be\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.280413 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-logs\") pod \"1289d10b-b2e9-4b14-ba46-c4de11e966be\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.280510 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1289d10b-b2e9-4b14-ba46-c4de11e966be\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.280618 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-httpd-run\") pod \"1289d10b-b2e9-4b14-ba46-c4de11e966be\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.280739 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-public-tls-certs\") pod \"1289d10b-b2e9-4b14-ba46-c4de11e966be\" (UID: 
\"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.280952 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1289d10b-b2e9-4b14-ba46-c4de11e966be" (UID: "1289d10b-b2e9-4b14-ba46-c4de11e966be"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.281298 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-scripts\") pod \"1289d10b-b2e9-4b14-ba46-c4de11e966be\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.281362 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-config-data\") pod \"1289d10b-b2e9-4b14-ba46-c4de11e966be\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.281409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-combined-ca-bundle\") pod \"1289d10b-b2e9-4b14-ba46-c4de11e966be\" (UID: \"1289d10b-b2e9-4b14-ba46-c4de11e966be\") " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.281680 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-logs" (OuterVolumeSpecName: "logs") pod "1289d10b-b2e9-4b14-ba46-c4de11e966be" (UID: "1289d10b-b2e9-4b14-ba46-c4de11e966be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.282626 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.282642 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1289d10b-b2e9-4b14-ba46-c4de11e966be-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.289581 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1289d10b-b2e9-4b14-ba46-c4de11e966be-kube-api-access-qnjk9" (OuterVolumeSpecName: "kube-api-access-qnjk9") pod "1289d10b-b2e9-4b14-ba46-c4de11e966be" (UID: "1289d10b-b2e9-4b14-ba46-c4de11e966be"). InnerVolumeSpecName "kube-api-access-qnjk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.292201 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "1289d10b-b2e9-4b14-ba46-c4de11e966be" (UID: "1289d10b-b2e9-4b14-ba46-c4de11e966be"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.315104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-scripts" (OuterVolumeSpecName: "scripts") pod "1289d10b-b2e9-4b14-ba46-c4de11e966be" (UID: "1289d10b-b2e9-4b14-ba46-c4de11e966be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.371737 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1289d10b-b2e9-4b14-ba46-c4de11e966be" (UID: "1289d10b-b2e9-4b14-ba46-c4de11e966be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.385858 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnjk9\" (UniqueName: \"kubernetes.io/projected/1289d10b-b2e9-4b14-ba46-c4de11e966be-kube-api-access-qnjk9\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.385895 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.385906 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.385915 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.400857 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-config-data" (OuterVolumeSpecName: "config-data") pod "1289d10b-b2e9-4b14-ba46-c4de11e966be" (UID: "1289d10b-b2e9-4b14-ba46-c4de11e966be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.402678 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.468791 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1289d10b-b2e9-4b14-ba46-c4de11e966be" (UID: "1289d10b-b2e9-4b14-ba46-c4de11e966be"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.487817 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.487848 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.487859 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289d10b-b2e9-4b14-ba46-c4de11e966be-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.829724 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.829721 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1289d10b-b2e9-4b14-ba46-c4de11e966be","Type":"ContainerDied","Data":"7e0632c3e0ef302be6b2cb4e0815c7de0dc57c8d679389b5584e85b1d8de1a64"} Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.831486 4771 scope.go:117] "RemoveContainer" containerID="5975a195724de8123a7141d59debc3ff9ad49c0b0de299c853384c4b0370a249" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.832020 4771 generic.go:334] "Generic (PLEG): container finished" podID="57ec654f-6921-476d-8001-aec299744492" containerID="c696924d536a08d6ade8f59f5787d17d17f48f3070b5b227285d140bca360a9c" exitCode=0 Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.832091 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kqdqm" event={"ID":"57ec654f-6921-476d-8001-aec299744492","Type":"ContainerDied","Data":"c696924d536a08d6ade8f59f5787d17d17f48f3070b5b227285d140bca360a9c"} Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.878447 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.903795 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.916014 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:24:28 crc kubenswrapper[4771]: E0227 01:24:28.916384 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerName="glance-httpd" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.916402 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerName="glance-httpd" Feb 27 01:24:28 crc kubenswrapper[4771]: E0227 01:24:28.916442 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerName="glance-log" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.916448 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerName="glance-log" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.916641 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerName="glance-httpd" Feb 27 
01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.916667 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" containerName="glance-log" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.917510 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.919105 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.947682 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 27 01:24:28 crc kubenswrapper[4771]: I0227 01:24:28.948481 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.009488 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htfvr\" (UniqueName: \"kubernetes.io/projected/12727ccf-0860-4f78-9d5e-4a043848ae2f-kube-api-access-htfvr\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.009541 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.009624 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-logs\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.009665 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.009684 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.009710 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.009745 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.009819 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.111792 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-logs\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.111851 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.111883 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.111902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.111997 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.112048 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.112087 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htfvr\" (UniqueName: \"kubernetes.io/projected/12727ccf-0860-4f78-9d5e-4a043848ae2f-kube-api-access-htfvr\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.112108 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.113221 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-logs\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.113407 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.114160 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.121312 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.131460 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.135398 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.142815 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htfvr\" (UniqueName: \"kubernetes.io/projected/12727ccf-0860-4f78-9d5e-4a043848ae2f-kube-api-access-htfvr\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.152044 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.211242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " pod="openstack/glance-default-external-api-0" Feb 27 
01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.275592 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:24:29 crc kubenswrapper[4771]: I0227 01:24:29.818867 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1289d10b-b2e9-4b14-ba46-c4de11e966be" path="/var/lib/kubelet/pods/1289d10b-b2e9-4b14-ba46-c4de11e966be/volumes" Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.462330 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.462713 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.677998 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.743806 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl5mg\" (UniqueName: \"kubernetes.io/projected/71ddad50-134a-4525-ade2-057c655b1a8c-kube-api-access-vl5mg\") pod \"71ddad50-134a-4525-ade2-057c655b1a8c\" (UID: \"71ddad50-134a-4525-ade2-057c655b1a8c\") " Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.753728 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ddad50-134a-4525-ade2-057c655b1a8c-kube-api-access-vl5mg" (OuterVolumeSpecName: "kube-api-access-vl5mg") pod "71ddad50-134a-4525-ade2-057c655b1a8c" (UID: "71ddad50-134a-4525-ade2-057c655b1a8c"). InnerVolumeSpecName "kube-api-access-vl5mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.845884 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl5mg\" (UniqueName: \"kubernetes.io/projected/71ddad50-134a-4525-ade2-057c655b1a8c-kube-api-access-vl5mg\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.855699 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" event={"ID":"71ddad50-134a-4525-ade2-057c655b1a8c","Type":"ContainerDied","Data":"431659f0c36b7dc87ded08281ee9357b730e23c85256a0be373319f7ce870b33"} Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.855740 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431659f0c36b7dc87ded08281ee9357b730e23c85256a0be373319f7ce870b33" Feb 27 01:24:30 crc kubenswrapper[4771]: I0227 01:24:30.855793 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535924-2x5jq" Feb 27 01:24:31 crc kubenswrapper[4771]: I0227 01:24:31.749368 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-kd9mw"] Feb 27 01:24:31 crc kubenswrapper[4771]: I0227 01:24:31.758872 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535918-kd9mw"] Feb 27 01:24:31 crc kubenswrapper[4771]: I0227 01:24:31.784356 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5cdfa47-132f-4eb8-95c0-efd8ba314ab7" path="/var/lib/kubelet/pods/f5cdfa47-132f-4eb8-95c0-efd8ba314ab7/volumes" Feb 27 01:24:32 crc kubenswrapper[4771]: I0227 01:24:32.392968 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:32 crc kubenswrapper[4771]: I0227 01:24:32.393005 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:32 crc kubenswrapper[4771]: I0227 01:24:32.645131 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:32 crc kubenswrapper[4771]: I0227 01:24:32.645380 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:33 crc kubenswrapper[4771]: I0227 01:24:33.595027 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" Feb 27 01:24:33 crc kubenswrapper[4771]: I0227 01:24:33.677118 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lmh2v"] Feb 27 01:24:33 crc kubenswrapper[4771]: I0227 01:24:33.678592 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" podUID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerName="dnsmasq-dns" containerID="cri-o://d939a357abf1eccc747917a70bbadfbe2d16e5df05e7b92b9be91f16e946efbd" gracePeriod=10 Feb 27 01:24:33 crc kubenswrapper[4771]: I0227 01:24:33.838086 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:24:33 crc kubenswrapper[4771]: I0227 01:24:33.891982 4771 generic.go:334] "Generic (PLEG): container finished" podID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerID="d939a357abf1eccc747917a70bbadfbe2d16e5df05e7b92b9be91f16e946efbd" exitCode=0 Feb 27 01:24:33 crc kubenswrapper[4771]: I0227 01:24:33.892023 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" event={"ID":"394871e9-ec61-4b01-8d2a-90ce7785052b","Type":"ContainerDied","Data":"d939a357abf1eccc747917a70bbadfbe2d16e5df05e7b92b9be91f16e946efbd"} Feb 27 01:24:33 crc kubenswrapper[4771]: I0227 01:24:33.944587 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" podUID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.697255 4771 scope.go:117] "RemoveContainer" containerID="08126fd2f44b949d894404c7c136317e8709e495ba119b3350a4bb74ebbe92db" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.820394 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.871642 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.882815 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-scripts\") pod \"57ec654f-6921-476d-8001-aec299744492\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.882923 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzls4\" (UniqueName: \"kubernetes.io/projected/57ec654f-6921-476d-8001-aec299744492-kube-api-access-hzls4\") pod \"57ec654f-6921-476d-8001-aec299744492\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.883017 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-fernet-keys\") pod \"57ec654f-6921-476d-8001-aec299744492\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.883107 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-combined-ca-bundle\") pod \"57ec654f-6921-476d-8001-aec299744492\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.883132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-config-data\") pod \"57ec654f-6921-476d-8001-aec299744492\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.883173 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-credential-keys\") pod \"57ec654f-6921-476d-8001-aec299744492\" (UID: \"57ec654f-6921-476d-8001-aec299744492\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.902668 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-scripts" (OuterVolumeSpecName: "scripts") pod "57ec654f-6921-476d-8001-aec299744492" (UID: "57ec654f-6921-476d-8001-aec299744492"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.939417 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "57ec654f-6921-476d-8001-aec299744492" (UID: "57ec654f-6921-476d-8001-aec299744492"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.944383 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ec654f-6921-476d-8001-aec299744492-kube-api-access-hzls4" (OuterVolumeSpecName: "kube-api-access-hzls4") pod "57ec654f-6921-476d-8001-aec299744492" (UID: "57ec654f-6921-476d-8001-aec299744492"). InnerVolumeSpecName "kube-api-access-hzls4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.948720 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "57ec654f-6921-476d-8001-aec299744492" (UID: "57ec654f-6921-476d-8001-aec299744492"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.977174 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af436659-00a5-4e7d-80a2-66bd5c1f5e04","Type":"ContainerDied","Data":"1706e27f6aefc5d84f0deb3328ec89fa2a4b038916f5482756c84fb7a59af688"} Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.977231 4771 scope.go:117] "RemoveContainer" containerID="5a6640b56c291cf50f39ebc8d2f8c3dabeb37fad62afd61d0d2487b3fb0dc78c" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.977368 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.977838 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57ec654f-6921-476d-8001-aec299744492" (UID: "57ec654f-6921-476d-8001-aec299744492"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.987833 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-internal-tls-certs\") pod \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.987999 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-logs\") pod \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.988048 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-httpd-run\") pod \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.988143 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-scripts\") pod \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.988290 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n77l\" (UniqueName: \"kubernetes.io/projected/af436659-00a5-4e7d-80a2-66bd5c1f5e04-kube-api-access-7n77l\") pod \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.988371 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-config-data\") pod \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.988436 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-combined-ca-bundle\") pod \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.988504 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\" (UID: \"af436659-00a5-4e7d-80a2-66bd5c1f5e04\") " Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.989110 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.989123 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.989134 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzls4\" (UniqueName: 
\"kubernetes.io/projected/57ec654f-6921-476d-8001-aec299744492-kube-api-access-hzls4\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.989145 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.989154 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.990418 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-logs" (OuterVolumeSpecName: "logs") pod "af436659-00a5-4e7d-80a2-66bd5c1f5e04" (UID: "af436659-00a5-4e7d-80a2-66bd5c1f5e04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.991201 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "af436659-00a5-4e7d-80a2-66bd5c1f5e04" (UID: "af436659-00a5-4e7d-80a2-66bd5c1f5e04"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.994684 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-config-data" (OuterVolumeSpecName: "config-data") pod "57ec654f-6921-476d-8001-aec299744492" (UID: "57ec654f-6921-476d-8001-aec299744492"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.996309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af436659-00a5-4e7d-80a2-66bd5c1f5e04-kube-api-access-7n77l" (OuterVolumeSpecName: "kube-api-access-7n77l") pod "af436659-00a5-4e7d-80a2-66bd5c1f5e04" (UID: "af436659-00a5-4e7d-80a2-66bd5c1f5e04"). InnerVolumeSpecName "kube-api-access-7n77l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:37 crc kubenswrapper[4771]: I0227 01:24:37.998623 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-scripts" (OuterVolumeSpecName: "scripts") pod "af436659-00a5-4e7d-80a2-66bd5c1f5e04" (UID: "af436659-00a5-4e7d-80a2-66bd5c1f5e04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.000324 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kqdqm" event={"ID":"57ec654f-6921-476d-8001-aec299744492","Type":"ContainerDied","Data":"2ae665dea5f555dbff0916bc220a3ab9906c2e5a42df2675362269f6532de747"} Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.000363 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ae665dea5f555dbff0916bc220a3ab9906c2e5a42df2675362269f6532de747" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.000434 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kqdqm" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.010468 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "af436659-00a5-4e7d-80a2-66bd5c1f5e04" (UID: "af436659-00a5-4e7d-80a2-66bd5c1f5e04"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.055245 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af436659-00a5-4e7d-80a2-66bd5c1f5e04" (UID: "af436659-00a5-4e7d-80a2-66bd5c1f5e04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.092667 4771 scope.go:117] "RemoveContainer" containerID="66685625b219e5dc6ee7ae59ebda3ee9c5966b929180288b7b70d54a1d7a242f" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.093522 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.096913 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57ec654f-6921-476d-8001-aec299744492-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.096939 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af436659-00a5-4e7d-80a2-66bd5c1f5e04-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.096948 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.096958 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n77l\" (UniqueName: \"kubernetes.io/projected/af436659-00a5-4e7d-80a2-66bd5c1f5e04-kube-api-access-7n77l\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.096973 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.097005 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.102025 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-config-data" (OuterVolumeSpecName: "config-data") pod "af436659-00a5-4e7d-80a2-66bd5c1f5e04" (UID: "af436659-00a5-4e7d-80a2-66bd5c1f5e04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.105652 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "af436659-00a5-4e7d-80a2-66bd5c1f5e04" (UID: "af436659-00a5-4e7d-80a2-66bd5c1f5e04"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.119402 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.199019 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.199052 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af436659-00a5-4e7d-80a2-66bd5c1f5e04-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.199064 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.335659 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.355455 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.369982 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.379847 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:38 crc kubenswrapper[4771]: E0227 01:24:38.382054 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ec654f-6921-476d-8001-aec299744492" containerName="keystone-bootstrap" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.382072 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ec654f-6921-476d-8001-aec299744492" containerName="keystone-bootstrap" Feb 27 01:24:38 crc kubenswrapper[4771]: E0227 01:24:38.382081 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerName="glance-httpd" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.382088 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerName="glance-httpd" Feb 27 01:24:38 crc kubenswrapper[4771]: E0227 01:24:38.382106 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ddad50-134a-4525-ade2-057c655b1a8c" containerName="oc" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.382112 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ddad50-134a-4525-ade2-057c655b1a8c" containerName="oc" Feb 27 01:24:38 crc kubenswrapper[4771]: E0227 01:24:38.382123 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerName="glance-log" Feb 27 01:24:38 crc 
kubenswrapper[4771]: I0227 01:24:38.382129 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerName="glance-log" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.382281 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ddad50-134a-4525-ade2-057c655b1a8c" containerName="oc" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.382310 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ec654f-6921-476d-8001-aec299744492" containerName="keystone-bootstrap" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.382326 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerName="glance-log" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.382341 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" containerName="glance-httpd" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.386850 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.386969 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.392279 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.394508 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.464517 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.506351 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.506419 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7hmb\" (UniqueName: \"kubernetes.io/projected/8bef7a7d-188a-4d22-9031-8365098a761f-kube-api-access-j7hmb\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.506444 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.506467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.506494 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.506527 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.507451 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.507512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610120 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-swift-storage-0\") pod \"394871e9-ec61-4b01-8d2a-90ce7785052b\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610177 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-config\") pod \"394871e9-ec61-4b01-8d2a-90ce7785052b\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610226 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wswsc\" (UniqueName: \"kubernetes.io/projected/394871e9-ec61-4b01-8d2a-90ce7785052b-kube-api-access-wswsc\") pod \"394871e9-ec61-4b01-8d2a-90ce7785052b\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610261 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-sb\") pod \"394871e9-ec61-4b01-8d2a-90ce7785052b\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610334 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-nb\") pod \"394871e9-ec61-4b01-8d2a-90ce7785052b\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610372 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-svc\") pod \"394871e9-ec61-4b01-8d2a-90ce7785052b\" (UID: \"394871e9-ec61-4b01-8d2a-90ce7785052b\") " Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7hmb\" (UniqueName: \"kubernetes.io/projected/8bef7a7d-188a-4d22-9031-8365098a761f-kube-api-access-j7hmb\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610729 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.610753 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.611186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.611230 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.611262 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.611321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.611361 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.611934 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.612223 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.614180 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.615744 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394871e9-ec61-4b01-8d2a-90ce7785052b-kube-api-access-wswsc" (OuterVolumeSpecName: "kube-api-access-wswsc") pod "394871e9-ec61-4b01-8d2a-90ce7785052b" (UID: "394871e9-ec61-4b01-8d2a-90ce7785052b"). InnerVolumeSpecName "kube-api-access-wswsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.618162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.619608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.628172 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.630760 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.635133 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7hmb\" (UniqueName: \"kubernetes.io/projected/8bef7a7d-188a-4d22-9031-8365098a761f-kube-api-access-j7hmb\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.667865 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") " pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.697220 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "394871e9-ec61-4b01-8d2a-90ce7785052b" (UID: "394871e9-ec61-4b01-8d2a-90ce7785052b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.710636 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "394871e9-ec61-4b01-8d2a-90ce7785052b" (UID: "394871e9-ec61-4b01-8d2a-90ce7785052b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.713244 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.713280 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wswsc\" (UniqueName: \"kubernetes.io/projected/394871e9-ec61-4b01-8d2a-90ce7785052b-kube-api-access-wswsc\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.713296 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.721294 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.725896 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "394871e9-ec61-4b01-8d2a-90ce7785052b" (UID: "394871e9-ec61-4b01-8d2a-90ce7785052b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.739258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "394871e9-ec61-4b01-8d2a-90ce7785052b" (UID: "394871e9-ec61-4b01-8d2a-90ce7785052b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.743477 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-config" (OuterVolumeSpecName: "config") pod "394871e9-ec61-4b01-8d2a-90ce7785052b" (UID: "394871e9-ec61-4b01-8d2a-90ce7785052b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.814570 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.814612 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.814626 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/394871e9-ec61-4b01-8d2a-90ce7785052b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.928322 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56bfd8fdf6-rxxnr"] Feb 27 01:24:38 crc kubenswrapper[4771]: E0227 01:24:38.928693 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerName="init" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.928709 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerName="init" Feb 27 01:24:38 crc kubenswrapper[4771]: E0227 01:24:38.928723 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerName="dnsmasq-dns" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.928730 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerName="dnsmasq-dns" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.928929 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="394871e9-ec61-4b01-8d2a-90ce7785052b" containerName="dnsmasq-dns" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.929477 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.935318 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7nd8k" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.935589 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.935777 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.935839 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.935900 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.936052 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 27 01:24:38 crc kubenswrapper[4771]: I0227 01:24:38.959144 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56bfd8fdf6-rxxnr"] Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.019729 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-internal-tls-certs\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.019808 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-combined-ca-bundle\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.019841 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-scripts\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.019882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-config-data\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.019909 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-credential-keys\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.019994 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-public-tls-certs\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: 
\"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.020040 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lljm\" (UniqueName: \"kubernetes.io/projected/b66f9559-0d35-47b3-ab89-06425ff3afd3-kube-api-access-9lljm\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.020070 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-fernet-keys\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.044915 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12727ccf-0860-4f78-9d5e-4a043848ae2f","Type":"ContainerStarted","Data":"e762342ba95a7afb59992ca9768d565ff447eb17bf07346eb04277d028f705b6"} Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.079577 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" event={"ID":"394871e9-ec61-4b01-8d2a-90ce7785052b","Type":"ContainerDied","Data":"59bc82cd2aa6b2f99c765d45ceddb45e3db6fe8591265300c8688097af7a996c"} Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.079664 4771 scope.go:117] "RemoveContainer" containerID="d939a357abf1eccc747917a70bbadfbe2d16e5df05e7b92b9be91f16e946efbd" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.079855 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lmh2v" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.122177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lljm\" (UniqueName: \"kubernetes.io/projected/b66f9559-0d35-47b3-ab89-06425ff3afd3-kube-api-access-9lljm\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.122245 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-fernet-keys\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.122301 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-internal-tls-certs\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.122336 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-combined-ca-bundle\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.122363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-scripts\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.122400 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-config-data\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.122420 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-credential-keys\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.122485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-public-tls-certs\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.138025 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-internal-tls-certs\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 
01:24:39.138629 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-fernet-keys\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.140232 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-scripts\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.144224 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-public-tls-certs\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.174697 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lljm\" (UniqueName: \"kubernetes.io/projected/b66f9559-0d35-47b3-ab89-06425ff3afd3-kube-api-access-9lljm\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.183584 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lmh2v"] Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.190812 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-credential-keys\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.190760 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-combined-ca-bundle\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.191747 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b66f9559-0d35-47b3-ab89-06425ff3afd3-config-data\") pod \"keystone-56bfd8fdf6-rxxnr\" (UID: \"b66f9559-0d35-47b3-ab89-06425ff3afd3\") " pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.197435 4771 scope.go:117] "RemoveContainer" containerID="e322bc4c34bfe742ef9e17db16433643e1554fca8bcb23fe3ecb49970965d046" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.211501 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lmh2v"] Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.327990 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.622133 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:24:39 crc kubenswrapper[4771]: W0227 01:24:39.651962 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bef7a7d_188a_4d22_9031_8365098a761f.slice/crio-443aa11c841954f61b12cb4c64afe7d29f64252f8276714878886676cb35f87a WatchSource:0}: Error finding container 443aa11c841954f61b12cb4c64afe7d29f64252f8276714878886676cb35f87a: Status 404 returned error can't find the container with id 443aa11c841954f61b12cb4c64afe7d29f64252f8276714878886676cb35f87a Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.810723 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394871e9-ec61-4b01-8d2a-90ce7785052b" path="/var/lib/kubelet/pods/394871e9-ec61-4b01-8d2a-90ce7785052b/volumes" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.812176 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af436659-00a5-4e7d-80a2-66bd5c1f5e04" path="/var/lib/kubelet/pods/af436659-00a5-4e7d-80a2-66bd5c1f5e04/volumes" Feb 27 01:24:39 crc kubenswrapper[4771]: I0227 01:24:39.895446 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56bfd8fdf6-rxxnr"] Feb 27 01:24:39 crc kubenswrapper[4771]: W0227 01:24:39.919299 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb66f9559_0d35_47b3_ab89_06425ff3afd3.slice/crio-7a01998b3dbfe01bd5179e6d4733af951a916ca0dc723c87013fb9df64bf6adf WatchSource:0}: Error finding container 7a01998b3dbfe01bd5179e6d4733af951a916ca0dc723c87013fb9df64bf6adf: Status 404 returned error can't find the container with id 7a01998b3dbfe01bd5179e6d4733af951a916ca0dc723c87013fb9df64bf6adf Feb 27 01:24:40 crc kubenswrapper[4771]: I0227 01:24:40.150272 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-r69k6" event={"ID":"a592bd48-ea9a-4f6c-a7fe-49185fbbed82","Type":"ContainerStarted","Data":"96a82b8dbdf626beb081b086821121237bcc704dca5923bbb45584bbac786f18"} Feb 27 01:24:40 crc kubenswrapper[4771]: I0227 01:24:40.152390 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8bef7a7d-188a-4d22-9031-8365098a761f","Type":"ContainerStarted","Data":"443aa11c841954f61b12cb4c64afe7d29f64252f8276714878886676cb35f87a"} Feb 27 01:24:40 crc kubenswrapper[4771]: I0227 01:24:40.153514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12727ccf-0860-4f78-9d5e-4a043848ae2f","Type":"ContainerStarted","Data":"b557c51300581ae1caf95ce0698f31914ca105125868b4fefa721b4e4d20c3ad"} Feb 27 01:24:40 crc kubenswrapper[4771]: I0227 01:24:40.171207 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3","Type":"ContainerStarted","Data":"ab60c67522c880db20e46078b8aec289c4416dd8b9970d7f7d96cf88261a61c6"} Feb 27 01:24:40 crc kubenswrapper[4771]: I0227 01:24:40.201646 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-r69k6" podStartSLOduration=3.058574633 podStartE2EDuration="47.201623767s" podCreationTimestamp="2026-02-27 01:23:53 +0000 UTC" firstStartedPulling="2026-02-27 
01:23:54.670319228 +0000 UTC m=+1147.607880516" lastFinishedPulling="2026-02-27 01:24:38.813368372 +0000 UTC m=+1191.750929650" observedRunningTime="2026-02-27 01:24:40.181033787 +0000 UTC m=+1193.118595075" watchObservedRunningTime="2026-02-27 01:24:40.201623767 +0000 UTC m=+1193.139185055" Feb 27 01:24:40 crc kubenswrapper[4771]: I0227 01:24:40.206229 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tclqb" event={"ID":"b411543d-f7a2-4a56-acb5-9b2d9598739a","Type":"ContainerStarted","Data":"e92d952b0af34475040583946aa19405a784b9afca08253635c5d9f8c5d9db1d"} Feb 27 01:24:40 crc kubenswrapper[4771]: I0227 01:24:40.220312 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56bfd8fdf6-rxxnr" event={"ID":"b66f9559-0d35-47b3-ab89-06425ff3afd3","Type":"ContainerStarted","Data":"7a01998b3dbfe01bd5179e6d4733af951a916ca0dc723c87013fb9df64bf6adf"} Feb 27 01:24:40 crc kubenswrapper[4771]: I0227 01:24:40.239817 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-tclqb" podStartSLOduration=3.320278974 podStartE2EDuration="47.239799826s" podCreationTimestamp="2026-02-27 01:23:53 +0000 UTC" firstStartedPulling="2026-02-27 01:23:54.893478121 +0000 UTC m=+1147.831039399" lastFinishedPulling="2026-02-27 01:24:38.812998963 +0000 UTC m=+1191.750560251" observedRunningTime="2026-02-27 01:24:40.226917635 +0000 UTC m=+1193.164478923" watchObservedRunningTime="2026-02-27 01:24:40.239799826 +0000 UTC m=+1193.177361114" Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.242146 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zhfdt" event={"ID":"37e7849a-97b9-4e3d-9ad3-c0c942775e64","Type":"ContainerStarted","Data":"cbd0d7c7d1911e59d1ef392693fbaee34ea486d72e649c9b2f23b338d82aa822"} Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.248133 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12727ccf-0860-4f78-9d5e-4a043848ae2f","Type":"ContainerStarted","Data":"c500d7a15278f96a54099daa331198c5ce41361d0a1b24eed20d43ca05a32c9a"} Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.256246 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56bfd8fdf6-rxxnr" event={"ID":"b66f9559-0d35-47b3-ab89-06425ff3afd3","Type":"ContainerStarted","Data":"0a060fe69aac5361bb3cb3a40d7b5751ad9bcc3523a4e9e008a7751da9611b0c"} Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.256384 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.264412 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8bef7a7d-188a-4d22-9031-8365098a761f","Type":"ContainerStarted","Data":"737b90e10a3eea907ddf7d3af1f7885d11f3951a631955b0c824af3f50a9825b"} Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.264465 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8bef7a7d-188a-4d22-9031-8365098a761f","Type":"ContainerStarted","Data":"d3facf87ac24c5acda65fef3df745ef287c70c4a52a00da437d5cd470942dbe9"} Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.279527 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-zhfdt" podStartSLOduration=4.076607653 podStartE2EDuration="48.279457705s" podCreationTimestamp="2026-02-27 
01:23:53 +0000 UTC" firstStartedPulling="2026-02-27 01:23:54.610150131 +0000 UTC m=+1147.547711419" lastFinishedPulling="2026-02-27 01:24:38.813000183 +0000 UTC m=+1191.750561471" observedRunningTime="2026-02-27 01:24:41.266345859 +0000 UTC m=+1194.203907147" watchObservedRunningTime="2026-02-27 01:24:41.279457705 +0000 UTC m=+1194.217019013" Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.310009 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56bfd8fdf6-rxxnr" podStartSLOduration=3.309983206 podStartE2EDuration="3.309983206s" podCreationTimestamp="2026-02-27 01:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:41.292002366 +0000 UTC m=+1194.229563654" watchObservedRunningTime="2026-02-27 01:24:41.309983206 +0000 UTC m=+1194.247544494" Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.328188 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.328169481 podStartE2EDuration="13.328169481s" podCreationTimestamp="2026-02-27 01:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:41.31895773 +0000 UTC m=+1194.256519018" watchObservedRunningTime="2026-02-27 01:24:41.328169481 +0000 UTC m=+1194.265730769" Feb 27 01:24:41 crc kubenswrapper[4771]: I0227 01:24:41.361365 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.361344334 podStartE2EDuration="3.361344334s" podCreationTimestamp="2026-02-27 01:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:41.351409383 +0000 UTC m=+1194.288970671" watchObservedRunningTime="2026-02-27 01:24:41.361344334 +0000 UTC m=+1194.298905622" Feb 27 01:24:42 crc kubenswrapper[4771]: I0227 01:24:42.395405 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fb8f8d788-kjgv6" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Feb 27 01:24:42 crc kubenswrapper[4771]: I0227 01:24:42.646880 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-555c84df64-lmgxw" podUID="9db15a3b-2c83-4d54-b5ea-697e6362b4e9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Feb 27 01:24:43 crc kubenswrapper[4771]: I0227 01:24:43.286134 4771 generic.go:334] "Generic (PLEG): container finished" podID="b411543d-f7a2-4a56-acb5-9b2d9598739a" containerID="e92d952b0af34475040583946aa19405a784b9afca08253635c5d9f8c5d9db1d" exitCode=0 Feb 27 01:24:43 crc kubenswrapper[4771]: I0227 01:24:43.286391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tclqb" event={"ID":"b411543d-f7a2-4a56-acb5-9b2d9598739a","Type":"ContainerDied","Data":"e92d952b0af34475040583946aa19405a784b9afca08253635c5d9f8c5d9db1d"} Feb 27 01:24:44 crc kubenswrapper[4771]: I0227 01:24:44.298663 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="a592bd48-ea9a-4f6c-a7fe-49185fbbed82" containerID="96a82b8dbdf626beb081b086821121237bcc704dca5923bbb45584bbac786f18" exitCode=0 Feb 27 01:24:44 crc kubenswrapper[4771]: I0227 01:24:44.298745 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-r69k6" event={"ID":"a592bd48-ea9a-4f6c-a7fe-49185fbbed82","Type":"ContainerDied","Data":"96a82b8dbdf626beb081b086821121237bcc704dca5923bbb45584bbac786f18"} Feb 27 01:24:45 crc kubenswrapper[4771]: I0227 01:24:45.313537 4771 generic.go:334] "Generic (PLEG): container finished" podID="37e7849a-97b9-4e3d-9ad3-c0c942775e64" containerID="cbd0d7c7d1911e59d1ef392693fbaee34ea486d72e649c9b2f23b338d82aa822" exitCode=0 Feb 27 01:24:45 crc kubenswrapper[4771]: I0227 01:24:45.313646 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zhfdt" event={"ID":"37e7849a-97b9-4e3d-9ad3-c0c942775e64","Type":"ContainerDied","Data":"cbd0d7c7d1911e59d1ef392693fbaee34ea486d72e649c9b2f23b338d82aa822"} Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.202695 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tclqb" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.214738 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-r69k6" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.323887 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tclqb" event={"ID":"b411543d-f7a2-4a56-acb5-9b2d9598739a","Type":"ContainerDied","Data":"e24d075f680eafd68e20864a624e7af6935c55c75d8bdbc7902e246aac015cb7"} Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.324246 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e24d075f680eafd68e20864a624e7af6935c55c75d8bdbc7902e246aac015cb7" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.324116 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tclqb" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.327008 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-r69k6" event={"ID":"a592bd48-ea9a-4f6c-a7fe-49185fbbed82","Type":"ContainerDied","Data":"eeee4f76390f988441a32b7ca39a4c74044868e5e415b1893c2124acd0e9c7b2"} Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.327041 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeee4f76390f988441a32b7ca39a4c74044868e5e415b1893c2124acd0e9c7b2" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.327087 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-r69k6" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.368653 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-config-data\") pod \"b411543d-f7a2-4a56-acb5-9b2d9598739a\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.368709 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-scripts\") pod \"b411543d-f7a2-4a56-acb5-9b2d9598739a\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.368756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-db-sync-config-data\") pod \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.368781 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-combined-ca-bundle\") pod \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.368812 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-combined-ca-bundle\") pod \"b411543d-f7a2-4a56-acb5-9b2d9598739a\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.368873 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwlnl\" (UniqueName: \"kubernetes.io/projected/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-kube-api-access-vwlnl\") pod \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\" (UID: \"a592bd48-ea9a-4f6c-a7fe-49185fbbed82\") " Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.368930 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5tfq\" (UniqueName: \"kubernetes.io/projected/b411543d-f7a2-4a56-acb5-9b2d9598739a-kube-api-access-d5tfq\") pod \"b411543d-f7a2-4a56-acb5-9b2d9598739a\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.368950 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b411543d-f7a2-4a56-acb5-9b2d9598739a-logs\") pod \"b411543d-f7a2-4a56-acb5-9b2d9598739a\" (UID: \"b411543d-f7a2-4a56-acb5-9b2d9598739a\") " Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.369715 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b411543d-f7a2-4a56-acb5-9b2d9598739a-logs" (OuterVolumeSpecName: "logs") pod "b411543d-f7a2-4a56-acb5-9b2d9598739a" (UID: "b411543d-f7a2-4a56-acb5-9b2d9598739a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.374965 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-scripts" (OuterVolumeSpecName: "scripts") pod "b411543d-f7a2-4a56-acb5-9b2d9598739a" (UID: "b411543d-f7a2-4a56-acb5-9b2d9598739a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.375746 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b411543d-f7a2-4a56-acb5-9b2d9598739a-kube-api-access-d5tfq" (OuterVolumeSpecName: "kube-api-access-d5tfq") pod "b411543d-f7a2-4a56-acb5-9b2d9598739a" (UID: "b411543d-f7a2-4a56-acb5-9b2d9598739a"). InnerVolumeSpecName "kube-api-access-d5tfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.376227 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a592bd48-ea9a-4f6c-a7fe-49185fbbed82" (UID: "a592bd48-ea9a-4f6c-a7fe-49185fbbed82"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.378442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-kube-api-access-vwlnl" (OuterVolumeSpecName: "kube-api-access-vwlnl") pod "a592bd48-ea9a-4f6c-a7fe-49185fbbed82" (UID: "a592bd48-ea9a-4f6c-a7fe-49185fbbed82"). InnerVolumeSpecName "kube-api-access-vwlnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.403724 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-config-data" (OuterVolumeSpecName: "config-data") pod "b411543d-f7a2-4a56-acb5-9b2d9598739a" (UID: "b411543d-f7a2-4a56-acb5-9b2d9598739a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.403815 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a592bd48-ea9a-4f6c-a7fe-49185fbbed82" (UID: "a592bd48-ea9a-4f6c-a7fe-49185fbbed82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.405987 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b411543d-f7a2-4a56-acb5-9b2d9598739a" (UID: "b411543d-f7a2-4a56-acb5-9b2d9598739a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.470963 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.470997 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.471005 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.471016 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.471024 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b411543d-f7a2-4a56-acb5-9b2d9598739a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.471032 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwlnl\" (UniqueName: \"kubernetes.io/projected/a592bd48-ea9a-4f6c-a7fe-49185fbbed82-kube-api-access-vwlnl\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.471041 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5tfq\" (UniqueName: \"kubernetes.io/projected/b411543d-f7a2-4a56-acb5-9b2d9598739a-kube-api-access-d5tfq\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.471048 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b411543d-f7a2-4a56-acb5-9b2d9598739a-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.504427 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5894b4657f-lj4ff"] Feb 27 01:24:46 crc kubenswrapper[4771]: E0227 01:24:46.504865 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a592bd48-ea9a-4f6c-a7fe-49185fbbed82" containerName="barbican-db-sync" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.504877 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a592bd48-ea9a-4f6c-a7fe-49185fbbed82" containerName="barbican-db-sync" Feb 27 01:24:46 crc kubenswrapper[4771]: E0227 01:24:46.504896 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b411543d-f7a2-4a56-acb5-9b2d9598739a" containerName="placement-db-sync" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.504903 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b411543d-f7a2-4a56-acb5-9b2d9598739a" containerName="placement-db-sync" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.505074 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b411543d-f7a2-4a56-acb5-9b2d9598739a" containerName="placement-db-sync" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.505087 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a592bd48-ea9a-4f6c-a7fe-49185fbbed82" 
containerName="barbican-db-sync" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.510106 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.517030 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.518474 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5894b4657f-lj4ff"] Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.578595 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dvmlh"] Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.579821 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdw58\" (UniqueName: \"kubernetes.io/projected/24cb181d-8c43-4ae8-9af0-b28f570f7f22-kube-api-access-gdw58\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.579858 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-config-data-custom\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.579911 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24cb181d-8c43-4ae8-9af0-b28f570f7f22-logs\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.579939 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-config-data\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.579960 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-combined-ca-bundle\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.580041 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.597135 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-66c7555cc4-mtbzr"] Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.607888 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.613608 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dvmlh"] Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.614698 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.624215 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-66c7555cc4-mtbzr"] Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683400 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683450 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdw58\" (UniqueName: \"kubernetes.io/projected/24cb181d-8c43-4ae8-9af0-b28f570f7f22-kube-api-access-gdw58\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-config-data-custom\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683508 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683576 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24cb181d-8c43-4ae8-9af0-b28f570f7f22-logs\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683602 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-config-data-custom\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683626 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-config-data\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683646 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-combined-ca-bundle\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-config\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683680 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-logs\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683718 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-config-data\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683739 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683764 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98lf7\" (UniqueName: \"kubernetes.io/projected/2bfd449e-5331-4591-b41b-b8603798476b-kube-api-access-98lf7\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-combined-ca-bundle\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.683857 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ns4f\" (UniqueName: \"kubernetes.io/projected/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-kube-api-access-9ns4f\") pod 
\"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.686068 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24cb181d-8c43-4ae8-9af0-b28f570f7f22-logs\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.690446 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-config-data-custom\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.693303 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-combined-ca-bundle\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.702060 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdw58\" (UniqueName: \"kubernetes.io/projected/24cb181d-8c43-4ae8-9af0-b28f570f7f22-kube-api-access-gdw58\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.734265 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24cb181d-8c43-4ae8-9af0-b28f570f7f22-config-data\") pod \"barbican-worker-5894b4657f-lj4ff\" (UID: \"24cb181d-8c43-4ae8-9af0-b28f570f7f22\") " pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.784877 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.784926 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-config-data-custom\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.784972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-config\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.784989 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-logs\") pod 
\"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.785030 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-config-data\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.785049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.785068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98lf7\" (UniqueName: \"kubernetes.io/projected/2bfd449e-5331-4591-b41b-b8603798476b-kube-api-access-98lf7\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.785102 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-combined-ca-bundle\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.785140 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.785157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ns4f\" (UniqueName: \"kubernetes.io/projected/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-kube-api-access-9ns4f\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.785175 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.785943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.787123 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.817018 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-config-data-custom\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.818574 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-config-data\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.819065 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-combined-ca-bundle\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.825071 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-logs\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.825163 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-599ccf9f8d-z6nsl"] Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.827056 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ns4f\" (UniqueName: \"kubernetes.io/projected/13fb6f6e-1dda-4e09-971a-d0629bc44ff4-kube-api-access-9ns4f\") pod \"barbican-keystone-listener-66c7555cc4-mtbzr\" (UID: \"13fb6f6e-1dda-4e09-971a-d0629bc44ff4\") " pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.827166 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98lf7\" (UniqueName: \"kubernetes.io/projected/2bfd449e-5331-4591-b41b-b8603798476b-kube-api-access-98lf7\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.829107 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.832404 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.839598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.839654 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.839876 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-config\") pod \"dnsmasq-dns-85ff748b95-dvmlh\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.839921 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5894b4657f-lj4ff" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.864614 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-599ccf9f8d-z6nsl"] Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.892358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m729f\" (UniqueName: \"kubernetes.io/projected/b53fd943-6ac9-4338-b3d5-83a9627f1c78-kube-api-access-m729f\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.892516 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.892537 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53fd943-6ac9-4338-b3d5-83a9627f1c78-logs\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.892735 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-combined-ca-bundle\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.892793 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data-custom\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.940926 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.941287 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.993851 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.993895 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53fd943-6ac9-4338-b3d5-83a9627f1c78-logs\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.993945 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-combined-ca-bundle\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.993998 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data-custom\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.994026 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m729f\" (UniqueName: \"kubernetes.io/projected/b53fd943-6ac9-4338-b3d5-83a9627f1c78-kube-api-access-m729f\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.995398 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53fd943-6ac9-4338-b3d5-83a9627f1c78-logs\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.998346 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-combined-ca-bundle\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:46 crc kubenswrapper[4771]: I0227 01:24:46.999209 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data-custom\") 
pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.015475 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.017012 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m729f\" (UniqueName: \"kubernetes.io/projected/b53fd943-6ac9-4338-b3d5-83a9627f1c78-kube-api-access-m729f\") pod \"barbican-api-599ccf9f8d-z6nsl\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") " pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.173211 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.311396 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5cddbc5576-b9kzz"] Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.315181 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.317297 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.317503 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fn4nn" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.317365 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.317378 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.317690 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.322195 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cddbc5576-b9kzz"] Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.506503 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-internal-tls-certs\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.506782 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-config-data\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.506925 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577d7298-4011-4f66-a59c-36b823400652-logs\") pod \"placement-5cddbc5576-b9kzz\" (UID: 
\"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.507201 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm69f\" (UniqueName: \"kubernetes.io/projected/577d7298-4011-4f66-a59c-36b823400652-kube-api-access-mm69f\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.507331 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-public-tls-certs\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.507455 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-combined-ca-bundle\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.507540 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-scripts\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.608904 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm69f\" (UniqueName: \"kubernetes.io/projected/577d7298-4011-4f66-a59c-36b823400652-kube-api-access-mm69f\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.609240 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-public-tls-certs\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.609300 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-combined-ca-bundle\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.609321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-scripts\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.609380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-internal-tls-certs\") pod \"placement-5cddbc5576-b9kzz\" (UID: 
\"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.609410 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-config-data\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.609430 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577d7298-4011-4f66-a59c-36b823400652-logs\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.609853 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/577d7298-4011-4f66-a59c-36b823400652-logs\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.613420 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-public-tls-certs\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.617381 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-scripts\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.618242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-config-data\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.620067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-combined-ca-bundle\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.621144 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/577d7298-4011-4f66-a59c-36b823400652-internal-tls-certs\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.628883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm69f\" (UniqueName: \"kubernetes.io/projected/577d7298-4011-4f66-a59c-36b823400652-kube-api-access-mm69f\") pod \"placement-5cddbc5576-b9kzz\" (UID: \"577d7298-4011-4f66-a59c-36b823400652\") " pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.639857 4771 util.go:30] "No sandbox for pod can be 
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.705268 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zhfdt"
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.709826 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-config-data\") pod \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") "
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.709917 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-combined-ca-bundle\") pod \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") "
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.709959 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-scripts\") pod \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") "
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.710000 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scgr9\" (UniqueName: \"kubernetes.io/projected/37e7849a-97b9-4e3d-9ad3-c0c942775e64-kube-api-access-scgr9\") pod \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") "
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.710063 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37e7849a-97b9-4e3d-9ad3-c0c942775e64-etc-machine-id\") pod \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") "
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.710092 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-db-sync-config-data\") pod \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\" (UID: \"37e7849a-97b9-4e3d-9ad3-c0c942775e64\") "
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.711049 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37e7849a-97b9-4e3d-9ad3-c0c942775e64-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "37e7849a-97b9-4e3d-9ad3-c0c942775e64" (UID: "37e7849a-97b9-4e3d-9ad3-c0c942775e64"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.771971 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "37e7849a-97b9-4e3d-9ad3-c0c942775e64" (UID: "37e7849a-97b9-4e3d-9ad3-c0c942775e64"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.778908 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-scripts" (OuterVolumeSpecName: "scripts") pod "37e7849a-97b9-4e3d-9ad3-c0c942775e64" (UID: "37e7849a-97b9-4e3d-9ad3-c0c942775e64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.786132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e7849a-97b9-4e3d-9ad3-c0c942775e64-kube-api-access-scgr9" (OuterVolumeSpecName: "kube-api-access-scgr9") pod "37e7849a-97b9-4e3d-9ad3-c0c942775e64" (UID: "37e7849a-97b9-4e3d-9ad3-c0c942775e64"). InnerVolumeSpecName "kube-api-access-scgr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.798564 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37e7849a-97b9-4e3d-9ad3-c0c942775e64" (UID: "37e7849a-97b9-4e3d-9ad3-c0c942775e64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.812185 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.812212 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scgr9\" (UniqueName: \"kubernetes.io/projected/37e7849a-97b9-4e3d-9ad3-c0c942775e64-kube-api-access-scgr9\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.812223 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37e7849a-97b9-4e3d-9ad3-c0c942775e64-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.812232 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.812242 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.826684 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-config-data" (OuterVolumeSpecName: "config-data") pod "37e7849a-97b9-4e3d-9ad3-c0c942775e64" (UID: "37e7849a-97b9-4e3d-9ad3-c0c942775e64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:47 crc kubenswrapper[4771]: I0227 01:24:47.913799 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e7849a-97b9-4e3d-9ad3-c0c942775e64-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:48 crc kubenswrapper[4771]: I0227 01:24:48.366916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zhfdt" event={"ID":"37e7849a-97b9-4e3d-9ad3-c0c942775e64","Type":"ContainerDied","Data":"2cd6bcdc237d5014263bcfe603054958d12eed63049322e3c231e7a0ef90116d"} Feb 27 01:24:48 crc kubenswrapper[4771]: I0227 01:24:48.367249 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd6bcdc237d5014263bcfe603054958d12eed63049322e3c231e7a0ef90116d" Feb 27 01:24:48 crc kubenswrapper[4771]: I0227 01:24:48.367127 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zhfdt" Feb 27 01:24:48 crc kubenswrapper[4771]: E0227 01:24:48.690799 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" Feb 27 01:24:48 crc kubenswrapper[4771]: I0227 01:24:48.722860 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:48 crc kubenswrapper[4771]: I0227 01:24:48.722919 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:48 crc kubenswrapper[4771]: I0227 01:24:48.760984 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:48 crc kubenswrapper[4771]: I0227 01:24:48.768858 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:48 crc kubenswrapper[4771]: W0227 01:24:48.971345 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bfd449e_5331_4591_b41b_b8603798476b.slice/crio-fa55cea504450c2b6be9d2379f2f71722b781ff3f673a7e18115a75848cb11e4 WatchSource:0}: Error finding container fa55cea504450c2b6be9d2379f2f71722b781ff3f673a7e18115a75848cb11e4: Status 404 returned error can't find the container with id fa55cea504450c2b6be9d2379f2f71722b781ff3f673a7e18115a75848cb11e4 Feb 27 01:24:48 crc kubenswrapper[4771]: I0227 01:24:48.976012 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dvmlh"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.014177 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 01:24:49 crc kubenswrapper[4771]: E0227 01:24:49.014833 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e7849a-97b9-4e3d-9ad3-c0c942775e64" containerName="cinder-db-sync" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.014854 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e7849a-97b9-4e3d-9ad3-c0c942775e64" containerName="cinder-db-sync" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.015096 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e7849a-97b9-4e3d-9ad3-c0c942775e64" 
containerName="cinder-db-sync" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.016194 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.040220 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.040479 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.040724 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.041187 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vdgqk" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.071954 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-599ccf9f8d-z6nsl"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.096607 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.134607 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.134652 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.134704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03e3abf3-c61d-4e79-b832-35abf5025c30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.134753 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-scripts\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.134781 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.134834 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz5xl\" (UniqueName: \"kubernetes.io/projected/03e3abf3-c61d-4e79-b832-35abf5025c30-kube-api-access-dz5xl\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc 
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.188195 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5894b4657f-lj4ff"]
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.216622 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cddbc5576-b9kzz"]
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.238225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.238297 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03e3abf3-c61d-4e79-b832-35abf5025c30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.238348 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-scripts\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.238379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.238438 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz5xl\" (UniqueName: \"kubernetes.io/projected/03e3abf3-c61d-4e79-b832-35abf5025c30-kube-api-access-dz5xl\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.238467 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.243886 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03e3abf3-c61d-4e79-b832-35abf5025c30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.245324 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tdlnj"]
Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.251332 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj"
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.254436 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-scripts\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.259132 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.260259 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.262937 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.263686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-66c7555cc4-mtbzr"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.271720 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz5xl\" (UniqueName: \"kubernetes.io/projected/03e3abf3-c61d-4e79-b832-35abf5025c30-kube-api-access-dz5xl\") pod \"cinder-scheduler-0\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") " pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.276100 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.276154 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.299958 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tdlnj"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.324298 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69fd595d46-6k6cs"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.334092 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.335850 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.338820 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.338961 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.340411 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.340484 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-config\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.340514 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfwz\" (UniqueName: \"kubernetes.io/projected/dd33f007-b06d-4b0a-afb8-64e98985e598-kube-api-access-qhfwz\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.340531 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.340803 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.340825 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.354385 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69fd595d46-6k6cs"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.354506 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.391363 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.398244 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" event={"ID":"13fb6f6e-1dda-4e09-971a-d0629bc44ff4","Type":"ContainerStarted","Data":"141816336d41a63a4a3f03d8f1abf7d4f0c02388e16737d76e17d73028692314"} Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.400276 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3","Type":"ContainerStarted","Data":"f2e74b1e9a38155222a85dba15b0d5b19c791a27557cec98ce384e6562519d88"} Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.400459 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="ceilometer-notification-agent" containerID="cri-o://0a9b0e1fe52972b5a3bc596d21d1528fa23c441d49b0f1a09d291cf2e6eeea3e" gracePeriod=30 Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.400564 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.401198 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="proxy-httpd" containerID="cri-o://f2e74b1e9a38155222a85dba15b0d5b19c791a27557cec98ce384e6562519d88" gracePeriod=30 Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.400789 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="sg-core" containerID="cri-o://ab60c67522c880db20e46078b8aec289c4416dd8b9970d7f7d96cf88261a61c6" gracePeriod=30 Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.405610 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5894b4657f-lj4ff" event={"ID":"24cb181d-8c43-4ae8-9af0-b28f570f7f22","Type":"ContainerStarted","Data":"0816fae2ddf3ee3a3050dc5a921940e55a4ae4c7fb130443e09f4a07cdc73c37"} Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.408186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-599ccf9f8d-z6nsl" event={"ID":"b53fd943-6ac9-4338-b3d5-83a9627f1c78","Type":"ContainerStarted","Data":"273976fc13151815122afbd74fa287abf905a24e9bc2809d09ffb60d97ee2f05"} Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.411526 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" event={"ID":"2bfd449e-5331-4591-b41b-b8603798476b","Type":"ContainerStarted","Data":"fa55cea504450c2b6be9d2379f2f71722b781ff3f673a7e18115a75848cb11e4"} Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.419401 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cddbc5576-b9kzz" event={"ID":"577d7298-4011-4f66-a59c-36b823400652","Type":"ContainerStarted","Data":"b9f888d83b9d18606e057a1131494727bdd6ce049e05f0186ef6b0b7a494978b"} Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.424120 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.424175 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: 
I0227 01:24:49.424193 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.424204 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.435666 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.441757 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442280 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfwz\" (UniqueName: \"kubernetes.io/projected/dd33f007-b06d-4b0a-afb8-64e98985e598-kube-api-access-qhfwz\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442350 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442369 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442391 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-config-data-custom\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442429 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b98kr\" (UniqueName: \"kubernetes.io/projected/d010a73f-6034-48ea-b18b-3bad26fe39ee-kube-api-access-b98kr\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-combined-ca-bundle\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442499 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d010a73f-6034-48ea-b18b-3bad26fe39ee-logs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.442644 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-internal-tls-certs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.443074 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-public-tls-certs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.443137 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-config-data\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.443181 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-config\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.443862 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.444230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.444390 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.445226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.446285 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-config\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.446429 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.457429 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.464913 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfwz\" (UniqueName: \"kubernetes.io/projected/dd33f007-b06d-4b0a-afb8-64e98985e598-kube-api-access-qhfwz\") pod \"dnsmasq-dns-5c9776ccc5-tdlnj\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.547094 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-config-data-custom\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.547911 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b98kr\" (UniqueName: \"kubernetes.io/projected/d010a73f-6034-48ea-b18b-3bad26fe39ee-kube-api-access-b98kr\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.548010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed4760-7b73-42e9-89e3-4d1fce55c607-logs\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.548191 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.548277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-combined-ca-bundle\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.548414 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data\") pod 
\"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.548488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d010a73f-6034-48ea-b18b-3bad26fe39ee-logs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.548692 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-internal-tls-certs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.549030 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data-custom\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.549122 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-scripts\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.548940 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d010a73f-6034-48ea-b18b-3bad26fe39ee-logs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.550202 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-public-tls-certs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.550301 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-config-data\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.550378 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bed4760-7b73-42e9-89e3-4d1fce55c607-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.550425 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cczkq\" (UniqueName: \"kubernetes.io/projected/9bed4760-7b73-42e9-89e3-4d1fce55c607-kube-api-access-cczkq\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 
01:24:49.553880 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-combined-ca-bundle\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.554038 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-internal-tls-certs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.554635 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-public-tls-certs\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.565734 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b98kr\" (UniqueName: \"kubernetes.io/projected/d010a73f-6034-48ea-b18b-3bad26fe39ee-kube-api-access-b98kr\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.566499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-config-data-custom\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.569012 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d010a73f-6034-48ea-b18b-3bad26fe39ee-config-data\") pod \"barbican-api-69fd595d46-6k6cs\" (UID: \"d010a73f-6034-48ea-b18b-3bad26fe39ee\") " pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.652406 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bed4760-7b73-42e9-89e3-4d1fce55c607-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.652455 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cczkq\" (UniqueName: \"kubernetes.io/projected/9bed4760-7b73-42e9-89e3-4d1fce55c607-kube-api-access-cczkq\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.652518 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed4760-7b73-42e9-89e3-4d1fce55c607-logs\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.652563 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.653216 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bed4760-7b73-42e9-89e3-4d1fce55c607-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.653604 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed4760-7b73-42e9-89e3-4d1fce55c607-logs\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.653675 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.654058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data-custom\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.654090 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-scripts\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.658537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data-custom\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.659192 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.659828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.663742 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-scripts\") pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.668386 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cczkq\" (UniqueName: \"kubernetes.io/projected/9bed4760-7b73-42e9-89e3-4d1fce55c607-kube-api-access-cczkq\") 
pod \"cinder-api-0\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.732476 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.801138 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.807244 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 01:24:49 crc kubenswrapper[4771]: I0227 01:24:49.914739 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.244876 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tdlnj"] Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.445144 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03e3abf3-c61d-4e79-b832-35abf5025c30","Type":"ContainerStarted","Data":"44815cbe87f42e8cabbcd8d4444578bec98e69a240b9ceb901a62ce7296e242b"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.460665 4771 generic.go:334] "Generic (PLEG): container finished" podID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerID="f2e74b1e9a38155222a85dba15b0d5b19c791a27557cec98ce384e6562519d88" exitCode=0 Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.460700 4771 generic.go:334] "Generic (PLEG): container finished" podID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerID="ab60c67522c880db20e46078b8aec289c4416dd8b9970d7f7d96cf88261a61c6" exitCode=2 Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.460786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3","Type":"ContainerDied","Data":"f2e74b1e9a38155222a85dba15b0d5b19c791a27557cec98ce384e6562519d88"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.460834 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3","Type":"ContainerDied","Data":"ab60c67522c880db20e46078b8aec289c4416dd8b9970d7f7d96cf88261a61c6"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.463394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" event={"ID":"dd33f007-b06d-4b0a-afb8-64e98985e598","Type":"ContainerStarted","Data":"1772810f2c49434d571a068258bc2d5cac4bc1c1755c2e1d2fb83f52d7eaf54d"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.482335 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69fd595d46-6k6cs"] Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.495542 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-599ccf9f8d-z6nsl" event={"ID":"b53fd943-6ac9-4338-b3d5-83a9627f1c78","Type":"ContainerStarted","Data":"b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.495616 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-599ccf9f8d-z6nsl" event={"ID":"b53fd943-6ac9-4338-b3d5-83a9627f1c78","Type":"ContainerStarted","Data":"3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.495937 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.495981 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.501751 4771 generic.go:334] "Generic (PLEG): container finished" podID="2bfd449e-5331-4591-b41b-b8603798476b" containerID="b80c843b6a09582b49758c5b1ac718f22200fc36b40f982c9ce96017bb3987ef" exitCode=0 Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.501849 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" event={"ID":"2bfd449e-5331-4591-b41b-b8603798476b","Type":"ContainerDied","Data":"b80c843b6a09582b49758c5b1ac718f22200fc36b40f982c9ce96017bb3987ef"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.507303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cddbc5576-b9kzz" event={"ID":"577d7298-4011-4f66-a59c-36b823400652","Type":"ContainerStarted","Data":"f70cd8e74661577c83ed136df93d740830df1c197e31f7a4f7c50235e39f0c59"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.507362 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cddbc5576-b9kzz" event={"ID":"577d7298-4011-4f66-a59c-36b823400652","Type":"ContainerStarted","Data":"7700c38c3ea46efe4616701c7ab6b59c7012ea609dff8c8579c756eb70f8f406"} Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.507382 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.507538 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.538774 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-599ccf9f8d-z6nsl" podStartSLOduration=4.538753182 podStartE2EDuration="4.538753182s" podCreationTimestamp="2026-02-27 01:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:50.523300652 +0000 UTC m=+1203.460861940" watchObservedRunningTime="2026-02-27 01:24:50.538753182 +0000 UTC m=+1203.476314470" Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.560403 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5cddbc5576-b9kzz" podStartSLOduration=3.560382502 podStartE2EDuration="3.560382502s" podCreationTimestamp="2026-02-27 01:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:50.555699974 +0000 UTC m=+1203.493261272" watchObservedRunningTime="2026-02-27 01:24:50.560382502 +0000 UTC m=+1203.497943790" Feb 27 01:24:50 crc kubenswrapper[4771]: I0227 01:24:50.597084 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 01:24:50 crc kubenswrapper[4771]: W0227 01:24:50.743466 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bed4760_7b73_42e9_89e3_4d1fce55c607.slice/crio-208ad725a7483c381dd33fa5297a50d113309e95245e2a3cd93fb2058b42efd7 WatchSource:0}: Error finding container 208ad725a7483c381dd33fa5297a50d113309e95245e2a3cd93fb2058b42efd7: Status 404 
returned error can't find the container with id 208ad725a7483c381dd33fa5297a50d113309e95245e2a3cd93fb2058b42efd7 Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.532981 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.536955 4771 generic.go:334] "Generic (PLEG): container finished" podID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerID="b98b33b34bf9699ba434450a4e8a742ba1cc8b6e07ff73ba995651810efd7dfe" exitCode=0 Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.537013 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" event={"ID":"dd33f007-b06d-4b0a-afb8-64e98985e598","Type":"ContainerDied","Data":"b98b33b34bf9699ba434450a4e8a742ba1cc8b6e07ff73ba995651810efd7dfe"} Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.539004 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9bed4760-7b73-42e9-89e3-4d1fce55c607","Type":"ContainerStarted","Data":"208ad725a7483c381dd33fa5297a50d113309e95245e2a3cd93fb2058b42efd7"} Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.567353 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" event={"ID":"2bfd449e-5331-4591-b41b-b8603798476b","Type":"ContainerDied","Data":"fa55cea504450c2b6be9d2379f2f71722b781ff3f673a7e18115a75848cb11e4"} Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.567420 4771 scope.go:117] "RemoveContainer" containerID="b80c843b6a09582b49758c5b1ac718f22200fc36b40f982c9ce96017bb3987ef" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.567423 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dvmlh" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.577220 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fd595d46-6k6cs" event={"ID":"d010a73f-6034-48ea-b18b-3bad26fe39ee","Type":"ContainerStarted","Data":"bf4a444cedbbb493f116b34cd00d8a936e768b1db0a1cce83a957535083a4f60"} Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.577275 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fd595d46-6k6cs" event={"ID":"d010a73f-6034-48ea-b18b-3bad26fe39ee","Type":"ContainerStarted","Data":"aa86d92f19e4072cfa23d30320888dbbf2204cd43025d4db2fcb4d2d1a6dceb5"} Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.603939 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-swift-storage-0\") pod \"2bfd449e-5331-4591-b41b-b8603798476b\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.604101 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98lf7\" (UniqueName: \"kubernetes.io/projected/2bfd449e-5331-4591-b41b-b8603798476b-kube-api-access-98lf7\") pod \"2bfd449e-5331-4591-b41b-b8603798476b\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.604178 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-svc\") pod \"2bfd449e-5331-4591-b41b-b8603798476b\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.604275 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-config\") pod \"2bfd449e-5331-4591-b41b-b8603798476b\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.604417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-nb\") pod \"2bfd449e-5331-4591-b41b-b8603798476b\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.604571 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-sb\") pod \"2bfd449e-5331-4591-b41b-b8603798476b\" (UID: \"2bfd449e-5331-4591-b41b-b8603798476b\") " Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.636080 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2bfd449e-5331-4591-b41b-b8603798476b" (UID: "2bfd449e-5331-4591-b41b-b8603798476b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.636227 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.637606 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.640346 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bfd449e-5331-4591-b41b-b8603798476b" (UID: "2bfd449e-5331-4591-b41b-b8603798476b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.641095 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bfd449e-5331-4591-b41b-b8603798476b-kube-api-access-98lf7" (OuterVolumeSpecName: "kube-api-access-98lf7") pod "2bfd449e-5331-4591-b41b-b8603798476b" (UID: "2bfd449e-5331-4591-b41b-b8603798476b"). InnerVolumeSpecName "kube-api-access-98lf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.669505 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-config" (OuterVolumeSpecName: "config") pod "2bfd449e-5331-4591-b41b-b8603798476b" (UID: "2bfd449e-5331-4591-b41b-b8603798476b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.683404 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.683503 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.704075 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bfd449e-5331-4591-b41b-b8603798476b" (UID: "2bfd449e-5331-4591-b41b-b8603798476b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.704351 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.707850 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.707881 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.707891 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98lf7\" (UniqueName: \"kubernetes.io/projected/2bfd449e-5331-4591-b41b-b8603798476b-kube-api-access-98lf7\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.707901 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.707911 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.737089 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2bfd449e-5331-4591-b41b-b8603798476b" (UID: "2bfd449e-5331-4591-b41b-b8603798476b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.809986 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfd449e-5331-4591-b41b-b8603798476b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.932823 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dvmlh"] Feb 27 01:24:51 crc kubenswrapper[4771]: I0227 01:24:51.938690 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dvmlh"] Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.590631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03e3abf3-c61d-4e79-b832-35abf5025c30","Type":"ContainerStarted","Data":"77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0"} Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.598916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" event={"ID":"13fb6f6e-1dda-4e09-971a-d0629bc44ff4","Type":"ContainerStarted","Data":"66d814936305872af7af6214bda7fcec81aaaa3dea022c447c47567d15cb2f71"} Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.598969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" event={"ID":"13fb6f6e-1dda-4e09-971a-d0629bc44ff4","Type":"ContainerStarted","Data":"f992ba29b2cc8f2be61269b5da60c5c05653f58878e13795bd314521c2dc18ed"} Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.610407 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fd595d46-6k6cs" event={"ID":"d010a73f-6034-48ea-b18b-3bad26fe39ee","Type":"ContainerStarted","Data":"48e5a9f66891964cab8ce863a8b7f858114597dedf10ac38910cb329cdf224ce"} Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.610738 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.610777 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69fd595d46-6k6cs" Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.619625 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-66c7555cc4-mtbzr" podStartSLOduration=4.308123608 podStartE2EDuration="6.619605533s" podCreationTimestamp="2026-02-27 01:24:46 +0000 UTC" firstStartedPulling="2026-02-27 01:24:49.228056699 +0000 UTC m=+1202.165617987" lastFinishedPulling="2026-02-27 01:24:51.539538624 +0000 UTC m=+1204.477099912" observedRunningTime="2026-02-27 01:24:52.617819245 +0000 UTC m=+1205.555380543" watchObservedRunningTime="2026-02-27 01:24:52.619605533 +0000 UTC m=+1205.557166831" Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.621991 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.622539 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" event={"ID":"dd33f007-b06d-4b0a-afb8-64e98985e598","Type":"ContainerStarted","Data":"ca181ac2b33ab45be8149bd583e9136329282f6be4045ce3d776d820aae982ea"} Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.643920 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-5894b4657f-lj4ff" event={"ID":"24cb181d-8c43-4ae8-9af0-b28f570f7f22","Type":"ContainerStarted","Data":"95a9b5f3417a39d58f16da605e096081dc0f4a5cc61d7cc3342afad9a640c705"} Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.643975 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5894b4657f-lj4ff" event={"ID":"24cb181d-8c43-4ae8-9af0-b28f570f7f22","Type":"ContainerStarted","Data":"3bcbf49b71edd95cfd7752b524177059fc6f84757bf826f3d6845a8a874942aa"} Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.649313 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9bed4760-7b73-42e9-89e3-4d1fce55c607","Type":"ContainerStarted","Data":"c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285"} Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.658534 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69fd595d46-6k6cs" podStartSLOduration=3.658515392 podStartE2EDuration="3.658515392s" podCreationTimestamp="2026-02-27 01:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:52.641697405 +0000 UTC m=+1205.579258703" watchObservedRunningTime="2026-02-27 01:24:52.658515392 +0000 UTC m=+1205.596076680" Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.673717 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" podStartSLOduration=3.673701226 podStartE2EDuration="3.673701226s" podCreationTimestamp="2026-02-27 01:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:52.669700547 +0000 UTC m=+1205.607261855" watchObservedRunningTime="2026-02-27 01:24:52.673701226 +0000 UTC m=+1205.611262514" Feb 27 01:24:52 crc kubenswrapper[4771]: I0227 01:24:52.697109 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5894b4657f-lj4ff" podStartSLOduration=4.269201879 podStartE2EDuration="6.697087362s" podCreationTimestamp="2026-02-27 01:24:46 +0000 UTC" firstStartedPulling="2026-02-27 01:24:49.138368098 +0000 UTC m=+1202.075929386" lastFinishedPulling="2026-02-27 01:24:51.566253581 +0000 UTC m=+1204.503814869" observedRunningTime="2026-02-27 01:24:52.691377297 +0000 UTC m=+1205.628938585" watchObservedRunningTime="2026-02-27 01:24:52.697087362 +0000 UTC m=+1205.634648650" Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.218309 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.673486 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9bed4760-7b73-42e9-89e3-4d1fce55c607","Type":"ContainerStarted","Data":"e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5"} Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.673960 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerName="cinder-api-log" containerID="cri-o://c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285" gracePeriod=30 Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.674355 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 27 
01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.674756 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerName="cinder-api" containerID="cri-o://e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5" gracePeriod=30 Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.682051 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03e3abf3-c61d-4e79-b832-35abf5025c30","Type":"ContainerStarted","Data":"78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2"} Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.688941 4771 generic.go:334] "Generic (PLEG): container finished" podID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerID="0a9b0e1fe52972b5a3bc596d21d1528fa23c441d49b0f1a09d291cf2e6eeea3e" exitCode=0 Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.688958 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.689233 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3","Type":"ContainerDied","Data":"0a9b0e1fe52972b5a3bc596d21d1528fa23c441d49b0f1a09d291cf2e6eeea3e"} Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.724870 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.724850268 podStartE2EDuration="4.724850268s" podCreationTimestamp="2026-02-27 01:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:53.702149241 +0000 UTC m=+1206.639710529" watchObservedRunningTime="2026-02-27 01:24:53.724850268 +0000 UTC m=+1206.662411556" Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.732534 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.972077575 podStartE2EDuration="5.732514036s" podCreationTimestamp="2026-02-27 01:24:48 +0000 UTC" firstStartedPulling="2026-02-27 01:24:49.970224263 +0000 UTC m=+1202.907785551" lastFinishedPulling="2026-02-27 01:24:50.730660724 +0000 UTC m=+1203.668222012" observedRunningTime="2026-02-27 01:24:53.722137674 +0000 UTC m=+1206.659698962" watchObservedRunningTime="2026-02-27 01:24:53.732514036 +0000 UTC m=+1206.670075324" Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.785803 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bfd449e-5331-4591-b41b-b8603798476b" path="/var/lib/kubelet/pods/2bfd449e-5331-4591-b41b-b8603798476b/volumes" Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.970877 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c5945c865-z7kz7"] Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.971345 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c5945c865-z7kz7" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-api" containerID="cri-o://ed10914f70340fa9edb88427c9685cc726d0e02226638f2978714fec22b0b45f" gracePeriod=30 Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.972919 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c5945c865-z7kz7" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" 
containerName="neutron-httpd" containerID="cri-o://f61f2708c8d73b030c8335a3c492a7f2c6f3eef399fef9b0a0394e4fc6e69bf6" gracePeriod=30 Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.997083 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6fd6bd959-l4htk"] Feb 27 01:24:53 crc kubenswrapper[4771]: E0227 01:24:53.997421 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bfd449e-5331-4591-b41b-b8603798476b" containerName="init" Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.997432 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bfd449e-5331-4591-b41b-b8603798476b" containerName="init" Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.997608 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bfd449e-5331-4591-b41b-b8603798476b" containerName="init" Feb 27 01:24:53 crc kubenswrapper[4771]: I0227 01:24:53.998466 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.010892 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fd6bd959-l4htk"] Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.083239 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-internal-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.083301 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-httpd-config\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.083500 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whtj\" (UniqueName: \"kubernetes.io/projected/db54a8be-2fc6-4aee-b505-e1a526407006-kube-api-access-5whtj\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.083536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-public-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.083592 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-ovndb-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.083623 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-combined-ca-bundle\") pod \"neutron-6fd6bd959-l4htk\" (UID: 
\"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.083642 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-config\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.087473 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6c5945c865-z7kz7" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": read tcp 10.217.0.2:38454->10.217.0.161:9696: read: connection reset by peer" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.185307 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whtj\" (UniqueName: \"kubernetes.io/projected/db54a8be-2fc6-4aee-b505-e1a526407006-kube-api-access-5whtj\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.185381 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-public-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.185405 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-ovndb-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.185428 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-combined-ca-bundle\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.185473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-config\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.185618 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-internal-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.185646 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-httpd-config\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 
01:24:54.195905 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-internal-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.195995 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-httpd-config\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.198463 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-config\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.201392 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-combined-ca-bundle\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.204859 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-ovndb-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.209664 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db54a8be-2fc6-4aee-b505-e1a526407006-public-tls-certs\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.219189 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whtj\" (UniqueName: \"kubernetes.io/projected/db54a8be-2fc6-4aee-b505-e1a526407006-kube-api-access-5whtj\") pod \"neutron-6fd6bd959-l4htk\" (UID: \"db54a8be-2fc6-4aee-b505-e1a526407006\") " pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.288492 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.340441 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.392671 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.393230 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkfnl\" (UniqueName: \"kubernetes.io/projected/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-kube-api-access-qkfnl\") pod \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.393265 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-scripts\") pod \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.393330 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-sg-core-conf-yaml\") pod \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.393396 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-log-httpd\") pod \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.393471 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-combined-ca-bundle\") pod \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.393517 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-config-data\") pod \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.393537 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-run-httpd\") pod \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\" (UID: \"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3\") " Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.394283 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" (UID: "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.396090 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" (UID: "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.401741 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-kube-api-access-qkfnl" (OuterVolumeSpecName: "kube-api-access-qkfnl") pod "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" (UID: "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3"). InnerVolumeSpecName "kube-api-access-qkfnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.402497 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-scripts" (OuterVolumeSpecName: "scripts") pod "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" (UID: "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.451948 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" (UID: "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.495250 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.495289 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.495302 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkfnl\" (UniqueName: \"kubernetes.io/projected/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-kube-api-access-qkfnl\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.495317 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.495330 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.512521 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" (UID: "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.535195 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-config-data" (OuterVolumeSpecName: "config-data") pod "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" (UID: "56ed04c4-c2a4-47be-8b9f-faaea9aab6c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.597914 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.598157 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.700308 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56ed04c4-c2a4-47be-8b9f-faaea9aab6c3","Type":"ContainerDied","Data":"a65e85a159a9b17486796d380a3351deecb3414ee3e1d85ac5d1b6c55b301bc2"} Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.700348 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.700374 4771 scope.go:117] "RemoveContainer" containerID="f2e74b1e9a38155222a85dba15b0d5b19c791a27557cec98ce384e6562519d88" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.710113 4771 generic.go:334] "Generic (PLEG): container finished" podID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerID="f61f2708c8d73b030c8335a3c492a7f2c6f3eef399fef9b0a0394e4fc6e69bf6" exitCode=0 Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.710168 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c5945c865-z7kz7" event={"ID":"eca4c4e6-7c04-473a-921b-c6f7e98c81b3","Type":"ContainerDied","Data":"f61f2708c8d73b030c8335a3c492a7f2c6f3eef399fef9b0a0394e4fc6e69bf6"} Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.712700 4771 generic.go:334] "Generic (PLEG): container finished" podID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerID="c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285" exitCode=143 Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.712746 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9bed4760-7b73-42e9-89e3-4d1fce55c607","Type":"ContainerDied","Data":"c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285"} Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.714397 4771 generic.go:334] "Generic (PLEG): container finished" podID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerID="fc53994384bdc2f24024e47615918ee993ad81722b3c10093142c5fd1d9a7756" exitCode=137 Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.714413 4771 generic.go:334] "Generic (PLEG): container finished" podID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerID="c6a969f9e4477c27a136509637556ccc0114acef1770783092c387907188497e" exitCode=137 Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.715287 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594f74c97c-r6bp5" event={"ID":"e291ec97-2bfe-4bbe-a39d-9eca937f1855","Type":"ContainerDied","Data":"fc53994384bdc2f24024e47615918ee993ad81722b3c10093142c5fd1d9a7756"} Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.715304 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594f74c97c-r6bp5" event={"ID":"e291ec97-2bfe-4bbe-a39d-9eca937f1855","Type":"ContainerDied","Data":"c6a969f9e4477c27a136509637556ccc0114acef1770783092c387907188497e"} Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.745341 4771 
scope.go:117] "RemoveContainer" containerID="ab60c67522c880db20e46078b8aec289c4416dd8b9970d7f7d96cf88261a61c6" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.811340 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.824916 4771 scope.go:117] "RemoveContainer" containerID="0a9b0e1fe52972b5a3bc596d21d1528fa23c441d49b0f1a09d291cf2e6eeea3e" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.837035 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.842530 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:24:54 crc kubenswrapper[4771]: E0227 01:24:54.842954 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="proxy-httpd" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.842967 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="proxy-httpd" Feb 27 01:24:54 crc kubenswrapper[4771]: E0227 01:24:54.842983 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="sg-core" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.842989 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="sg-core" Feb 27 01:24:54 crc kubenswrapper[4771]: E0227 01:24:54.843003 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="ceilometer-notification-agent" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.843010 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="ceilometer-notification-agent" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.843181 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="sg-core" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.843198 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="ceilometer-notification-agent" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.843211 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" containerName="proxy-httpd" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.844745 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.850146 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.850305 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.860596 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.924253 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:24:54 crc kubenswrapper[4771]: E0227 01:24:54.939129 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56ed04c4_c2a4_47be_8b9f_faaea9aab6c3.slice/crio-a65e85a159a9b17486796d380a3351deecb3414ee3e1d85ac5d1b6c55b301bc2\": RecentStats: unable to find data in memory cache]" Feb 27 01:24:54 crc kubenswrapper[4771]: W0227 01:24:54.974765 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb54a8be_2fc6_4aee_b505_e1a526407006.slice/crio-771abe68ef044179bbf5d679144dd8fda7ecdf4b970a78038ea64819799444c9 WatchSource:0}: Error finding container 771abe68ef044179bbf5d679144dd8fda7ecdf4b970a78038ea64819799444c9: Status 404 returned error can't find the container with id 771abe68ef044179bbf5d679144dd8fda7ecdf4b970a78038ea64819799444c9 Feb 27 01:24:54 crc kubenswrapper[4771]: I0227 01:24:54.981748 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fd6bd959-l4htk"] Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.007197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-scripts\") pod \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.007377 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx2kh\" (UniqueName: \"kubernetes.io/projected/e291ec97-2bfe-4bbe-a39d-9eca937f1855-kube-api-access-mx2kh\") pod \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.007473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e291ec97-2bfe-4bbe-a39d-9eca937f1855-horizon-secret-key\") pod \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.007625 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e291ec97-2bfe-4bbe-a39d-9eca937f1855-logs\") pod \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.007756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-config-data\") pod \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\" (UID: \"e291ec97-2bfe-4bbe-a39d-9eca937f1855\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.007976 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e291ec97-2bfe-4bbe-a39d-9eca937f1855-logs" (OuterVolumeSpecName: "logs") pod "e291ec97-2bfe-4bbe-a39d-9eca937f1855" (UID: "e291ec97-2bfe-4bbe-a39d-9eca937f1855"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.008144 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.008234 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-config-data\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.008330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-log-httpd\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.008396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-run-httpd\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.008470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-scripts\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.008618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.008694 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmd5\" (UniqueName: \"kubernetes.io/projected/27da755f-7147-4fee-af32-994932f0b715-kube-api-access-frmd5\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.008811 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e291ec97-2bfe-4bbe-a39d-9eca937f1855-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.013621 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e291ec97-2bfe-4bbe-a39d-9eca937f1855-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e291ec97-2bfe-4bbe-a39d-9eca937f1855" (UID: "e291ec97-2bfe-4bbe-a39d-9eca937f1855"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.013729 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e291ec97-2bfe-4bbe-a39d-9eca937f1855-kube-api-access-mx2kh" (OuterVolumeSpecName: "kube-api-access-mx2kh") pod "e291ec97-2bfe-4bbe-a39d-9eca937f1855" (UID: "e291ec97-2bfe-4bbe-a39d-9eca937f1855"). InnerVolumeSpecName "kube-api-access-mx2kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.027816 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-scripts" (OuterVolumeSpecName: "scripts") pod "e291ec97-2bfe-4bbe-a39d-9eca937f1855" (UID: "e291ec97-2bfe-4bbe-a39d-9eca937f1855"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.028370 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-config-data" (OuterVolumeSpecName: "config-data") pod "e291ec97-2bfe-4bbe-a39d-9eca937f1855" (UID: "e291ec97-2bfe-4bbe-a39d-9eca937f1855"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.110395 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.111189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-config-data\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.111272 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-log-httpd\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.111296 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-run-httpd\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.111363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-scripts\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.112022 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-log-httpd\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.112109 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-run-httpd\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.112211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.112247 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmd5\" (UniqueName: \"kubernetes.io/projected/27da755f-7147-4fee-af32-994932f0b715-kube-api-access-frmd5\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.112423 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.112438 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e291ec97-2bfe-4bbe-a39d-9eca937f1855-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.112449 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx2kh\" (UniqueName: \"kubernetes.io/projected/e291ec97-2bfe-4bbe-a39d-9eca937f1855-kube-api-access-mx2kh\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.112459 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e291ec97-2bfe-4bbe-a39d-9eca937f1855-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.118158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.118320 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.119299 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-config-data\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.121623 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-scripts\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.130744 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-frmd5\" (UniqueName: \"kubernetes.io/projected/27da755f-7147-4fee-af32-994932f0b715-kube-api-access-frmd5\") pod \"ceilometer-0\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") " pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.188992 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.220981 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.491228 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-555c84df64-lmgxw" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.583703 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.721755 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cczkq\" (UniqueName: \"kubernetes.io/projected/9bed4760-7b73-42e9-89e3-4d1fce55c607-kube-api-access-cczkq\") pod \"9bed4760-7b73-42e9-89e3-4d1fce55c607\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.722111 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-scripts\") pod \"9bed4760-7b73-42e9-89e3-4d1fce55c607\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.722246 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data\") pod \"9bed4760-7b73-42e9-89e3-4d1fce55c607\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.722434 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed4760-7b73-42e9-89e3-4d1fce55c607-logs\") pod \"9bed4760-7b73-42e9-89e3-4d1fce55c607\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.722578 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bed4760-7b73-42e9-89e3-4d1fce55c607-etc-machine-id\") pod \"9bed4760-7b73-42e9-89e3-4d1fce55c607\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.722641 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bed4760-7b73-42e9-89e3-4d1fce55c607-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9bed4760-7b73-42e9-89e3-4d1fce55c607" (UID: "9bed4760-7b73-42e9-89e3-4d1fce55c607"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.722805 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bed4760-7b73-42e9-89e3-4d1fce55c607-logs" (OuterVolumeSpecName: "logs") pod "9bed4760-7b73-42e9-89e3-4d1fce55c607" (UID: "9bed4760-7b73-42e9-89e3-4d1fce55c607"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.723002 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data-custom\") pod \"9bed4760-7b73-42e9-89e3-4d1fce55c607\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.723134 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-combined-ca-bundle\") pod \"9bed4760-7b73-42e9-89e3-4d1fce55c607\" (UID: \"9bed4760-7b73-42e9-89e3-4d1fce55c607\") " Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.723824 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bed4760-7b73-42e9-89e3-4d1fce55c607-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.723967 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bed4760-7b73-42e9-89e3-4d1fce55c607-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.727062 4771 generic.go:334] "Generic (PLEG): container finished" podID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerID="e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5" exitCode=0 Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.727128 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9bed4760-7b73-42e9-89e3-4d1fce55c607","Type":"ContainerDied","Data":"e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5"} Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.727149 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9bed4760-7b73-42e9-89e3-4d1fce55c607","Type":"ContainerDied","Data":"208ad725a7483c381dd33fa5297a50d113309e95245e2a3cd93fb2058b42efd7"} Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.727166 4771 scope.go:117] "RemoveContainer" containerID="e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.727361 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.729746 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bed4760-7b73-42e9-89e3-4d1fce55c607-kube-api-access-cczkq" (OuterVolumeSpecName: "kube-api-access-cczkq") pod "9bed4760-7b73-42e9-89e3-4d1fce55c607" (UID: "9bed4760-7b73-42e9-89e3-4d1fce55c607"). InnerVolumeSpecName "kube-api-access-cczkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.734084 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594f74c97c-r6bp5" event={"ID":"e291ec97-2bfe-4bbe-a39d-9eca937f1855","Type":"ContainerDied","Data":"e6bfd19b6bc88cee5b51d9a873846b38bed6b86f360da5995b0f86d8c030aeae"} Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.734198 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-594f74c97c-r6bp5" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.735743 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-scripts" (OuterVolumeSpecName: "scripts") pod "9bed4760-7b73-42e9-89e3-4d1fce55c607" (UID: "9bed4760-7b73-42e9-89e3-4d1fce55c607"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.740666 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9bed4760-7b73-42e9-89e3-4d1fce55c607" (UID: "9bed4760-7b73-42e9-89e3-4d1fce55c607"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.748612 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6bd959-l4htk" event={"ID":"db54a8be-2fc6-4aee-b505-e1a526407006","Type":"ContainerStarted","Data":"89aafa3426e45af249bcacf3fe04141502142ccf9c294f4d60484db77c3ce35f"} Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.748682 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6bd959-l4htk" event={"ID":"db54a8be-2fc6-4aee-b505-e1a526407006","Type":"ContainerStarted","Data":"ca215bbef9615695cac9de71fe8a76dcca769995a77a04fb2893eec23a5c32c6"} Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.748695 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fd6bd959-l4htk" event={"ID":"db54a8be-2fc6-4aee-b505-e1a526407006","Type":"ContainerStarted","Data":"771abe68ef044179bbf5d679144dd8fda7ecdf4b970a78038ea64819799444c9"} Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.748711 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.755997 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bed4760-7b73-42e9-89e3-4d1fce55c607" (UID: "9bed4760-7b73-42e9-89e3-4d1fce55c607"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.757534 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.764100 4771 scope.go:117] "RemoveContainer" containerID="c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.781743 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6c5945c865-z7kz7" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": dial tcp 10.217.0.161:9696: connect: connection refused" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.790270 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6fd6bd959-l4htk" podStartSLOduration=2.790245448 podStartE2EDuration="2.790245448s" podCreationTimestamp="2026-02-27 01:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:55.774385097 +0000 UTC m=+1208.711946385" watchObservedRunningTime="2026-02-27 01:24:55.790245448 +0000 UTC m=+1208.727806746" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.796166 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ed04c4-c2a4-47be-8b9f-faaea9aab6c3" path="/var/lib/kubelet/pods/56ed04c4-c2a4-47be-8b9f-faaea9aab6c3/volumes" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.803989 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594f74c97c-r6bp5"] Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.811425 4771 scope.go:117] "RemoveContainer" containerID="e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5" Feb 27 01:24:55 crc kubenswrapper[4771]: E0227 01:24:55.812015 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5\": container with ID starting with e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5 not found: ID does not exist" containerID="e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.812161 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5"} err="failed to get container status \"e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5\": rpc error: code = NotFound desc = could not find container \"e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5\": container with ID starting with e7ce76defb68be4302607139bc88494b2ca8ae023fe03c5fc6e305182c3083a5 not found: ID does not exist" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.812316 4771 scope.go:117] "RemoveContainer" containerID="c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.812170 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-594f74c97c-r6bp5"] Feb 27 01:24:55 crc kubenswrapper[4771]: E0227 01:24:55.813011 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285\": 
container with ID starting with c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285 not found: ID does not exist" containerID="c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.813060 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285"} err="failed to get container status \"c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285\": rpc error: code = NotFound desc = could not find container \"c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285\": container with ID starting with c4131aff870d0903040e0b6187cdf1bb3466664f106d2fe9873474a7c94dd285 not found: ID does not exist" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.813086 4771 scope.go:117] "RemoveContainer" containerID="fc53994384bdc2f24024e47615918ee993ad81722b3c10093142c5fd1d9a7756" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.814280 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data" (OuterVolumeSpecName: "config-data") pod "9bed4760-7b73-42e9-89e3-4d1fce55c607" (UID: "9bed4760-7b73-42e9-89e3-4d1fce55c607"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.829940 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.829972 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.829981 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cczkq\" (UniqueName: \"kubernetes.io/projected/9bed4760-7b73-42e9-89e3-4d1fce55c607-kube-api-access-cczkq\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.829992 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.830000 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bed4760-7b73-42e9-89e3-4d1fce55c607-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:24:55 crc kubenswrapper[4771]: I0227 01:24:55.995880 4771 scope.go:117] "RemoveContainer" containerID="c6a969f9e4477c27a136509637556ccc0114acef1770783092c387907188497e" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.067375 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.086424 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.099694 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 27 01:24:56 crc kubenswrapper[4771]: E0227 01:24:56.100117 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" 
containerName="horizon" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.100133 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerName="horizon" Feb 27 01:24:56 crc kubenswrapper[4771]: E0227 01:24:56.100156 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerName="cinder-api" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.100164 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerName="cinder-api" Feb 27 01:24:56 crc kubenswrapper[4771]: E0227 01:24:56.100180 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerName="cinder-api-log" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.100187 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerName="cinder-api-log" Feb 27 01:24:56 crc kubenswrapper[4771]: E0227 01:24:56.100201 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerName="horizon-log" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.100206 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerName="horizon-log" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.100368 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerName="cinder-api" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.100379 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerName="horizon" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.100386 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" containerName="horizon-log" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.100400 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" containerName="cinder-api-log" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.101307 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.106650 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.106725 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.106580 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.109190 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.244829 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b708a5c-dd83-482a-bf4a-988909a38d76-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.244878 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b708a5c-dd83-482a-bf4a-988909a38d76-logs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.244911 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.245201 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.245287 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hxkp\" (UniqueName: \"kubernetes.io/projected/3b708a5c-dd83-482a-bf4a-988909a38d76-kube-api-access-4hxkp\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.245346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-scripts\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.245385 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-config-data\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.245470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.245506 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.346822 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.346877 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.346949 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b708a5c-dd83-482a-bf4a-988909a38d76-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.346981 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b708a5c-dd83-482a-bf4a-988909a38d76-logs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.347000 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.347079 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.347125 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hxkp\" (UniqueName: \"kubernetes.io/projected/3b708a5c-dd83-482a-bf4a-988909a38d76-kube-api-access-4hxkp\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.347143 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-scripts\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.347161 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-config-data\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.347954 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b708a5c-dd83-482a-bf4a-988909a38d76-logs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.350768 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b708a5c-dd83-482a-bf4a-988909a38d76-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.351330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.355414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-config-data\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.358081 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.363327 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.369085 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.369504 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hxkp\" (UniqueName: \"kubernetes.io/projected/3b708a5c-dd83-482a-bf4a-988909a38d76-kube-api-access-4hxkp\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.370109 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b708a5c-dd83-482a-bf4a-988909a38d76-scripts\") pod \"cinder-api-0\" (UID: \"3b708a5c-dd83-482a-bf4a-988909a38d76\") " pod="openstack/cinder-api-0" Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.435894 4771 util.go:30] "No sandbox for pod can be found. 
Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.435894 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 27 01:24:56 crc kubenswrapper[4771]: I0227 01:24:56.781165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerStarted","Data":"4a1c936adc8313926f5ce4da7087dea8f9f9f83321d2ee37df906d9553bbd2c4"}
Feb 27 01:24:57 crc kubenswrapper[4771]: W0227 01:24:57.027871 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b708a5c_dd83_482a_bf4a_988909a38d76.slice/crio-74df3a4b4d3cc5b632d6eed6db4ac1c90b07d413b1c8d59484a3f5a4bc4af210 WatchSource:0}: Error finding container 74df3a4b4d3cc5b632d6eed6db4ac1c90b07d413b1c8d59484a3f5a4bc4af210: Status 404 returned error can't find the container with id 74df3a4b4d3cc5b632d6eed6db4ac1c90b07d413b1c8d59484a3f5a4bc4af210
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.032424 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.158103 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7fb8f8d788-kjgv6"
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.668068 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-555c84df64-lmgxw"
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.749413 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fb8f8d788-kjgv6"]
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.806894 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bed4760-7b73-42e9-89e3-4d1fce55c607" path="/var/lib/kubelet/pods/9bed4760-7b73-42e9-89e3-4d1fce55c607/volumes"
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.812681 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e291ec97-2bfe-4bbe-a39d-9eca937f1855" path="/var/lib/kubelet/pods/e291ec97-2bfe-4bbe-a39d-9eca937f1855/volumes"
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.828106 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerStarted","Data":"11ce1410c8bac052456f2674c6d6df4df2b88857e5958f89e1bd872be3ce0ff5"}
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.839928 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7fb8f8d788-kjgv6" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon-log" containerID="cri-o://f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3" gracePeriod=30
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.840238 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b708a5c-dd83-482a-bf4a-988909a38d76","Type":"ContainerStarted","Data":"d565ef37634d2c2685f27c453bb77d551a920de41ae20baa978c526c4b18ff59"}
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.840266 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b708a5c-dd83-482a-bf4a-988909a38d76","Type":"ContainerStarted","Data":"74df3a4b4d3cc5b632d6eed6db4ac1c90b07d413b1c8d59484a3f5a4bc4af210"}
Feb 27 01:24:57 crc kubenswrapper[4771]: I0227 01:24:57.840626 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7fb8f8d788-kjgv6" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon" containerID="cri-o://ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d" gracePeriod=30
Feb 27 01:24:58 crc kubenswrapper[4771]: I0227 01:24:58.860916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b708a5c-dd83-482a-bf4a-988909a38d76","Type":"ContainerStarted","Data":"6a6a0f31f2086a600367a44e167cd72416f2b64f91bfe3e7ab6602292ee8b673"}
Feb 27 01:24:58 crc kubenswrapper[4771]: I0227 01:24:58.862343 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 27 01:24:58 crc kubenswrapper[4771]: I0227 01:24:58.870814 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerStarted","Data":"fa6776706eed4113429819150516809f826cee28acfecc2925bd2ca2e23044da"}
Feb 27 01:24:58 crc kubenswrapper[4771]: I0227 01:24:58.895457 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.895438051 podStartE2EDuration="2.895438051s" podCreationTimestamp="2026-02-27 01:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:24:58.881732738 +0000 UTC m=+1211.819294036" watchObservedRunningTime="2026-02-27 01:24:58.895438051 +0000 UTC m=+1211.832999339"
Feb 27 01:24:58 crc kubenswrapper[4771]: I0227 01:24:58.924272 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-599ccf9f8d-z6nsl"
Feb 27 01:24:58 crc kubenswrapper[4771]: I0227 01:24:58.948979 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-599ccf9f8d-z6nsl"
Feb 27 01:24:59 crc kubenswrapper[4771]: I0227 01:24:59.635304 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 27 01:24:59 crc kubenswrapper[4771]: I0227 01:24:59.704242 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 01:24:59 crc kubenswrapper[4771]: I0227 01:24:59.734794 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj"
Feb 27 01:24:59 crc kubenswrapper[4771]: I0227 01:24:59.797181 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-n5x8h"]
Feb 27 01:24:59 crc kubenswrapper[4771]: I0227 01:24:59.805787 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" podUID="345ab929-8a28-4d72-a196-bd831e1f3d0a" containerName="dnsmasq-dns" containerID="cri-o://617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c" gracePeriod=10
Feb 27 01:24:59 crc kubenswrapper[4771]: I0227 01:24:59.930882 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerStarted","Data":"550648b9982885f7e4e9c7c22d09cc61b35ae7c81dca6526f000a6f13a2ecb1b"}
Feb 27 01:24:59 crc kubenswrapper[4771]: I0227 01:24:59.931808 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerName="cinder-scheduler" containerID="cri-o://77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0" gracePeriod=30
Feb 27 01:24:59 crc kubenswrapper[4771]: I0227 01:24:59.932611 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerName="probe" containerID="cri-o://78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2" gracePeriod=30
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.337509 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h"
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.422733 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-sb\") pod \"345ab929-8a28-4d72-a196-bd831e1f3d0a\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") "
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.422813 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-config\") pod \"345ab929-8a28-4d72-a196-bd831e1f3d0a\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") "
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.422847 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlfzm\" (UniqueName: \"kubernetes.io/projected/345ab929-8a28-4d72-a196-bd831e1f3d0a-kube-api-access-tlfzm\") pod \"345ab929-8a28-4d72-a196-bd831e1f3d0a\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") "
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.422882 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-swift-storage-0\") pod \"345ab929-8a28-4d72-a196-bd831e1f3d0a\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") "
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.422937 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-nb\") pod \"345ab929-8a28-4d72-a196-bd831e1f3d0a\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") "
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.423044 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-svc\") pod \"345ab929-8a28-4d72-a196-bd831e1f3d0a\" (UID: \"345ab929-8a28-4d72-a196-bd831e1f3d0a\") "
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.431328 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/345ab929-8a28-4d72-a196-bd831e1f3d0a-kube-api-access-tlfzm" (OuterVolumeSpecName: "kube-api-access-tlfzm") pod "345ab929-8a28-4d72-a196-bd831e1f3d0a" (UID: "345ab929-8a28-4d72-a196-bd831e1f3d0a"). InnerVolumeSpecName "kube-api-access-tlfzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.525441 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlfzm\" (UniqueName: \"kubernetes.io/projected/345ab929-8a28-4d72-a196-bd831e1f3d0a-kube-api-access-tlfzm\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.544320 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "345ab929-8a28-4d72-a196-bd831e1f3d0a" (UID: "345ab929-8a28-4d72-a196-bd831e1f3d0a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.556273 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-config" (OuterVolumeSpecName: "config") pod "345ab929-8a28-4d72-a196-bd831e1f3d0a" (UID: "345ab929-8a28-4d72-a196-bd831e1f3d0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.556840 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "345ab929-8a28-4d72-a196-bd831e1f3d0a" (UID: "345ab929-8a28-4d72-a196-bd831e1f3d0a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.560281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "345ab929-8a28-4d72-a196-bd831e1f3d0a" (UID: "345ab929-8a28-4d72-a196-bd831e1f3d0a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.574598 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "345ab929-8a28-4d72-a196-bd831e1f3d0a" (UID: "345ab929-8a28-4d72-a196-bd831e1f3d0a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.628720 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.628756 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.628766 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.628775 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.628784 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/345ab929-8a28-4d72-a196-bd831e1f3d0a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.943189 4771 generic.go:334] "Generic (PLEG): container finished" podID="345ab929-8a28-4d72-a196-bd831e1f3d0a" containerID="617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c" exitCode=0
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.943460 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" event={"ID":"345ab929-8a28-4d72-a196-bd831e1f3d0a","Type":"ContainerDied","Data":"617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c"}
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.943486 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h" event={"ID":"345ab929-8a28-4d72-a196-bd831e1f3d0a","Type":"ContainerDied","Data":"a523b83193d12a4f2500465609748f068d334769a2f8403113a9fd53af811240"}
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.943501 4771 scope.go:117] "RemoveContainer" containerID="617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c"
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.943636 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-n5x8h"
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.972422 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerStarted","Data":"89e7cd8bc60506a2a24f7532e7eab47b97372fc8e3b8d634886cd4a769da84d2"}
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.972467 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 27 01:25:00 crc kubenswrapper[4771]: I0227 01:25:00.999743 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.204682414 podStartE2EDuration="6.999727059s" podCreationTimestamp="2026-02-27 01:24:54 +0000 UTC" firstStartedPulling="2026-02-27 01:24:55.785043466 +0000 UTC m=+1208.722604754" lastFinishedPulling="2026-02-27 01:25:00.580088121 +0000 UTC m=+1213.517649399" observedRunningTime="2026-02-27 01:25:00.994359853 +0000 UTC m=+1213.931921141" watchObservedRunningTime="2026-02-27 01:25:00.999727059 +0000 UTC m=+1213.937288347"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.036516 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-n5x8h"]
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.043561 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-n5x8h"]
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.049273 4771 scope.go:117] "RemoveContainer" containerID="c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.118317 4771 scope.go:117] "RemoveContainer" containerID="617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c"
Feb 27 01:25:01 crc kubenswrapper[4771]: E0227 01:25:01.118731 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c\": container with ID starting with 617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c not found: ID does not exist" containerID="617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.118755 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c"} err="failed to get container status \"617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c\": rpc error: code = NotFound desc = could not find container \"617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c\": container with ID starting with 617b858b6cec96202071d7138e71cfb49b7320690803df419dee57403c1b814c not found: ID does not exist"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.118777 4771 scope.go:117] "RemoveContainer" containerID="c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f"
Feb 27 01:25:01 crc kubenswrapper[4771]: E0227 01:25:01.119000 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f\": container with ID starting with c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f not found: ID does not exist" containerID="c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.119022 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f"} err="failed to get container status \"c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f\": rpc error: code = NotFound desc = could not find container \"c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f\": container with ID starting with c23cc082f148d95e0671b126b2c45b531b74f8c6f74e9b5c6d5df5e1e3ea8e8f not found: ID does not exist"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.575500 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69fd595d46-6k6cs"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.751753 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69fd595d46-6k6cs"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.802674 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="345ab929-8a28-4d72-a196-bd831e1f3d0a" path="/var/lib/kubelet/pods/345ab929-8a28-4d72-a196-bd831e1f3d0a/volumes"
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.855725 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-599ccf9f8d-z6nsl"]
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.856076 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-599ccf9f8d-z6nsl" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api-log" containerID="cri-o://3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519" gracePeriod=30
Feb 27 01:25:01 crc kubenswrapper[4771]: I0227 01:25:01.856267 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-599ccf9f8d-z6nsl" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api" containerID="cri-o://b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce" gracePeriod=30
Feb 27 01:25:02 crc kubenswrapper[4771]: I0227 01:25:02.001307 4771 generic.go:334] "Generic (PLEG): container finished" podID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerID="ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d" exitCode=0
Feb 27 01:25:02 crc kubenswrapper[4771]: I0227 01:25:02.001393 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8f8d788-kjgv6" event={"ID":"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7","Type":"ContainerDied","Data":"ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d"}
Feb 27 01:25:02 crc kubenswrapper[4771]: I0227 01:25:02.017854 4771 generic.go:334] "Generic (PLEG): container finished" podID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerID="78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2" exitCode=0
Feb 27 01:25:02 crc kubenswrapper[4771]: I0227 01:25:02.017994 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03e3abf3-c61d-4e79-b832-35abf5025c30","Type":"ContainerDied","Data":"78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2"}
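
[annotation] The E-level "ContainerStatus from runtime service failed ... NotFound" entries above look alarming but each follows a successful RemoveContainer for the same ID: the kubelet deletes the container, then re-queries its status and finds it already gone, which is the benign delete-then-verify race. A filter that surfaces only NotFound errors with no preceding RemoveContainer for that ID might look like the sketch below (64-hex IDs and the file name are assumptions about this capture):

    import re

    # Keep NotFound errors only when no RemoveContainer targeted that ID first.
    CID = re.compile(r'[0-9a-f]{64}')
    removed, unexplained = set(), []
    for line in open('kubelet.log'):
        ids = set(CID.findall(line))
        if '"RemoveContainer"' in line:
            removed |= ids
        elif 'code = NotFound' in line and ids and not ids & removed:
            unexplained.append(line.rstrip())
    print(len(unexplained), 'NotFound errors without a prior RemoveContainer')

Run over this excerpt the count is zero: every NotFound here is the idempotent-deletion case.
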
Feb 27 01:25:02 crc kubenswrapper[4771]: I0227 01:25:02.394153 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8f8d788-kjgv6" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused"
Feb 27 01:25:03 crc kubenswrapper[4771]: I0227 01:25:03.048486 4771 generic.go:334] "Generic (PLEG): container finished" podID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerID="3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519" exitCode=143
Feb 27 01:25:03 crc kubenswrapper[4771]: I0227 01:25:03.049242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-599ccf9f8d-z6nsl" event={"ID":"b53fd943-6ac9-4338-b3d5-83a9627f1c78","Type":"ContainerDied","Data":"3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519"}
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.065676 4771 generic.go:334] "Generic (PLEG): container finished" podID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerID="ed10914f70340fa9edb88427c9685cc726d0e02226638f2978714fec22b0b45f" exitCode=0
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.066279 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c5945c865-z7kz7" event={"ID":"eca4c4e6-7c04-473a-921b-c6f7e98c81b3","Type":"ContainerDied","Data":"ed10914f70340fa9edb88427c9685cc726d0e02226638f2978714fec22b0b45f"}
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.196282 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c5945c865-z7kz7"
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.295740 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-config\") pod \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.295789 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-combined-ca-bundle\") pod \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.295847 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-internal-tls-certs\") pod \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.295892 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-httpd-config\") pod \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.295951 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-ovndb-tls-certs\") pod \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.296021 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-public-tls-certs\") pod \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.296123 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk2lj\" (UniqueName: \"kubernetes.io/projected/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-kube-api-access-wk2lj\") pod \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\" (UID: \"eca4c4e6-7c04-473a-921b-c6f7e98c81b3\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.319975 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-kube-api-access-wk2lj" (OuterVolumeSpecName: "kube-api-access-wk2lj") pod "eca4c4e6-7c04-473a-921b-c6f7e98c81b3" (UID: "eca4c4e6-7c04-473a-921b-c6f7e98c81b3"). InnerVolumeSpecName "kube-api-access-wk2lj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.326674 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "eca4c4e6-7c04-473a-921b-c6f7e98c81b3" (UID: "eca4c4e6-7c04-473a-921b-c6f7e98c81b3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.348157 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-config" (OuterVolumeSpecName: "config") pod "eca4c4e6-7c04-473a-921b-c6f7e98c81b3" (UID: "eca4c4e6-7c04-473a-921b-c6f7e98c81b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.350990 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eca4c4e6-7c04-473a-921b-c6f7e98c81b3" (UID: "eca4c4e6-7c04-473a-921b-c6f7e98c81b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.361623 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eca4c4e6-7c04-473a-921b-c6f7e98c81b3" (UID: "eca4c4e6-7c04-473a-921b-c6f7e98c81b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.363702 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eca4c4e6-7c04-473a-921b-c6f7e98c81b3" (UID: "eca4c4e6-7c04-473a-921b-c6f7e98c81b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.397921 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.398007 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk2lj\" (UniqueName: \"kubernetes.io/projected/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-kube-api-access-wk2lj\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.398027 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.398043 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.398055 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.398066 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.402654 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "eca4c4e6-7c04-473a-921b-c6f7e98c81b3" (UID: "eca4c4e6-7c04-473a-921b-c6f7e98c81b3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.499813 4771 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eca4c4e6-7c04-473a-921b-c6f7e98c81b3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.861683 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.911511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-scripts\") pod \"03e3abf3-c61d-4e79-b832-35abf5025c30\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.911945 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-combined-ca-bundle\") pod \"03e3abf3-c61d-4e79-b832-35abf5025c30\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.912031 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03e3abf3-c61d-4e79-b832-35abf5025c30-etc-machine-id\") pod \"03e3abf3-c61d-4e79-b832-35abf5025c30\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.912140 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data-custom\") pod \"03e3abf3-c61d-4e79-b832-35abf5025c30\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.912265 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data\") pod \"03e3abf3-c61d-4e79-b832-35abf5025c30\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.912360 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz5xl\" (UniqueName: \"kubernetes.io/projected/03e3abf3-c61d-4e79-b832-35abf5025c30-kube-api-access-dz5xl\") pod \"03e3abf3-c61d-4e79-b832-35abf5025c30\" (UID: \"03e3abf3-c61d-4e79-b832-35abf5025c30\") "
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.912742 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03e3abf3-c61d-4e79-b832-35abf5025c30-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "03e3abf3-c61d-4e79-b832-35abf5025c30" (UID: "03e3abf3-c61d-4e79-b832-35abf5025c30"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.919035 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-scripts" (OuterVolumeSpecName: "scripts") pod "03e3abf3-c61d-4e79-b832-35abf5025c30" (UID: "03e3abf3-c61d-4e79-b832-35abf5025c30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.921432 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "03e3abf3-c61d-4e79-b832-35abf5025c30" (UID: "03e3abf3-c61d-4e79-b832-35abf5025c30"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.923200 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03e3abf3-c61d-4e79-b832-35abf5025c30-kube-api-access-dz5xl" (OuterVolumeSpecName: "kube-api-access-dz5xl") pod "03e3abf3-c61d-4e79-b832-35abf5025c30" (UID: "03e3abf3-c61d-4e79-b832-35abf5025c30"). InnerVolumeSpecName "kube-api-access-dz5xl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:25:04 crc kubenswrapper[4771]: I0227 01:25:04.970776 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03e3abf3-c61d-4e79-b832-35abf5025c30" (UID: "03e3abf3-c61d-4e79-b832-35abf5025c30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.014148 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz5xl\" (UniqueName: \"kubernetes.io/projected/03e3abf3-c61d-4e79-b832-35abf5025c30-kube-api-access-dz5xl\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.014184 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.014197 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.014208 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03e3abf3-c61d-4e79-b832-35abf5025c30-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.014220 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.018033 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-599ccf9f8d-z6nsl" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:51112->10.217.0.168:9311: read: connection reset by peer"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.018122 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-599ccf9f8d-z6nsl" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:51100->10.217.0.168:9311: read: connection reset by peer"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.029584 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data" (OuterVolumeSpecName: "config-data") pod "03e3abf3-c61d-4e79-b832-35abf5025c30" (UID: "03e3abf3-c61d-4e79-b832-35abf5025c30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.081347 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c5945c865-z7kz7" event={"ID":"eca4c4e6-7c04-473a-921b-c6f7e98c81b3","Type":"ContainerDied","Data":"fb38757c81a6e8a2f500cbd4f718222cf897a2c41b9583c86be6d120203ab552"}
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.081394 4771 scope.go:117] "RemoveContainer" containerID="f61f2708c8d73b030c8335a3c492a7f2c6f3eef399fef9b0a0394e4fc6e69bf6"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.081489 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c5945c865-z7kz7"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.089282 4771 generic.go:334] "Generic (PLEG): container finished" podID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerID="77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0" exitCode=0
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.089323 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03e3abf3-c61d-4e79-b832-35abf5025c30","Type":"ContainerDied","Data":"77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0"}
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.089349 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"03e3abf3-c61d-4e79-b832-35abf5025c30","Type":"ContainerDied","Data":"44815cbe87f42e8cabbcd8d4444578bec98e69a240b9ceb901a62ce7296e242b"}
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.089410 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.115406 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e3abf3-c61d-4e79-b832-35abf5025c30-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.137986 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c5945c865-z7kz7"]
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.157244 4771 scope.go:117] "RemoveContainer" containerID="ed10914f70340fa9edb88427c9685cc726d0e02226638f2978714fec22b0b45f"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.197767 4771 scope.go:117] "RemoveContainer" containerID="78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.197924 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c5945c865-z7kz7"]
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.221219 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.224432 4771 scope.go:117] "RemoveContainer" containerID="77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.234280 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.243407 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.244067 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerName="probe"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.244151 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerName="probe"
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.244226 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="345ab929-8a28-4d72-a196-bd831e1f3d0a" containerName="dnsmasq-dns"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.244320 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="345ab929-8a28-4d72-a196-bd831e1f3d0a" containerName="dnsmasq-dns"
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.244387 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="345ab929-8a28-4d72-a196-bd831e1f3d0a" containerName="init"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.244438 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="345ab929-8a28-4d72-a196-bd831e1f3d0a" containerName="init"
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.244511 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-httpd"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.244578 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-httpd"
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.244659 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerName="cinder-scheduler"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.244751 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerName="cinder-scheduler"
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.244850 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-api"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.245038 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-api"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.245159 4771 scope.go:117] "RemoveContainer" containerID="78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2"
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.246168 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2\": container with ID starting with 78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2 not found: ID does not exist" containerID="78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.246214 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2"} err="failed to get container status \"78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2\": rpc error: code = NotFound desc = could not find container \"78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2\": container with ID starting with 78931a56a27ade7568dba31e202aafa1126640db6ead34072afa80852fe068c2 not found: ID does not exist"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.246237 4771 scope.go:117] "RemoveContainer" containerID="77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0"
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.246620 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0\": container with ID starting with 77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0 not found: ID does not exist" containerID="77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.246644 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0"} err="failed to get container status \"77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0\": rpc error: code = NotFound desc = could not find container \"77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0\": container with ID starting with 77272668990de879afe181aa43c8313372c5351b2a6364c3078ca0004de832e0 not found: ID does not exist"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.246836 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-api"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.246929 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" containerName="neutron-httpd"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.247017 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerName="cinder-scheduler"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.247094 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" containerName="probe"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.247172 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="345ab929-8a28-4d72-a196-bd831e1f3d0a" containerName="dnsmasq-dns"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.248215 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: E0227 01:25:05.248289 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03e3abf3_c61d_4e79_b832_35abf5025c30.slice/crio-44815cbe87f42e8cabbcd8d4444578bec98e69a240b9ceb901a62ce7296e242b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeca4c4e6_7c04_473a_921b_c6f7e98c81b3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb53fd943_6ac9_4338_b3d5_83a9627f1c78.slice/crio-conmon-b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce.scope\": RecentStats: unable to find data in memory cache]"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.250269 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.256471 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.318951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.318989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.319030 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e15c68f3-a904-4d91-a778-4e5b5a728c9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.319165 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.319235 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.319436 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qxkj\" (UniqueName: \"kubernetes.io/projected/e15c68f3-a904-4d91-a778-4e5b5a728c9f-kube-api-access-7qxkj\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
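
[annotation] The cadvisor "Partial failure issuing cadvisor.ContainerInfoV2" entry above names cgroup slices belonging to containers torn down moments earlier (cinder-scheduler-0, neutron, barbican), so the stats-cache misses are transient rather than a monitoring fault. The pod UID is recoverable from a kubepods slice name by swapping underscores back to dashes; the helper below is illustrative only:

    # Recover a pod UID from a kubepods cgroup slice name.
    def slice_to_uid(slice_name):
        stem = slice_name.split('-pod', 1)[1].split('.slice', 1)[0]
        return stem.replace('_', '-')

    print(slice_to_uid('kubepods-besteffort-pod03e3abf3_c61d_4e79_b832_35abf5025c30.slice'))
    # -> 03e3abf3-c61d-4e79-b832-35abf5025c30, the cinder-scheduler-0 pod removed above
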
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.381685 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-599ccf9f8d-z6nsl"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.421344 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53fd943-6ac9-4338-b3d5-83a9627f1c78-logs\") pod \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") "
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.421406 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data-custom\") pod \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") "
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.421488 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m729f\" (UniqueName: \"kubernetes.io/projected/b53fd943-6ac9-4338-b3d5-83a9627f1c78-kube-api-access-m729f\") pod \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") "
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.421652 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data\") pod \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") "
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.421698 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-combined-ca-bundle\") pod \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\" (UID: \"b53fd943-6ac9-4338-b3d5-83a9627f1c78\") "
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.421939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.421975 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.422052 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qxkj\" (UniqueName: \"kubernetes.io/projected/e15c68f3-a904-4d91-a778-4e5b5a728c9f-kube-api-access-7qxkj\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.422177 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.422208 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.422257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e15c68f3-a904-4d91-a778-4e5b5a728c9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.422357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e15c68f3-a904-4d91-a778-4e5b5a728c9f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.426410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b53fd943-6ac9-4338-b3d5-83a9627f1c78-logs" (OuterVolumeSpecName: "logs") pod "b53fd943-6ac9-4338-b3d5-83a9627f1c78" (UID: "b53fd943-6ac9-4338-b3d5-83a9627f1c78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.431530 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b53fd943-6ac9-4338-b3d5-83a9627f1c78" (UID: "b53fd943-6ac9-4338-b3d5-83a9627f1c78"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.432794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-scripts\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.433103 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-config-data\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.434074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.435842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53fd943-6ac9-4338-b3d5-83a9627f1c78-kube-api-access-m729f" (OuterVolumeSpecName: "kube-api-access-m729f") pod "b53fd943-6ac9-4338-b3d5-83a9627f1c78" (UID: "b53fd943-6ac9-4338-b3d5-83a9627f1c78"). InnerVolumeSpecName "kube-api-access-m729f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.444531 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15c68f3-a904-4d91-a778-4e5b5a728c9f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.449636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qxkj\" (UniqueName: \"kubernetes.io/projected/e15c68f3-a904-4d91-a778-4e5b5a728c9f-kube-api-access-7qxkj\") pod \"cinder-scheduler-0\" (UID: \"e15c68f3-a904-4d91-a778-4e5b5a728c9f\") " pod="openstack/cinder-scheduler-0"
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.463590 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b53fd943-6ac9-4338-b3d5-83a9627f1c78" (UID: "b53fd943-6ac9-4338-b3d5-83a9627f1c78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.481907 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data" (OuterVolumeSpecName: "config-data") pod "b53fd943-6ac9-4338-b3d5-83a9627f1c78" (UID: "b53fd943-6ac9-4338-b3d5-83a9627f1c78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.523747 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.523785 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m729f\" (UniqueName: \"kubernetes.io/projected/b53fd943-6ac9-4338-b3d5-83a9627f1c78-kube-api-access-m729f\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.523801 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.523842 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53fd943-6ac9-4338-b3d5-83a9627f1c78-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.523853 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b53fd943-6ac9-4338-b3d5-83a9627f1c78-logs\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.576766 4771 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.786332 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03e3abf3-c61d-4e79-b832-35abf5025c30" path="/var/lib/kubelet/pods/03e3abf3-c61d-4e79-b832-35abf5025c30/volumes" Feb 27 01:25:05 crc kubenswrapper[4771]: I0227 01:25:05.787419 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca4c4e6-7c04-473a-921b-c6f7e98c81b3" path="/var/lib/kubelet/pods/eca4c4e6-7c04-473a-921b-c6f7e98c81b3/volumes" Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.056148 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.102299 4771 generic.go:334] "Generic (PLEG): container finished" podID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerID="b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce" exitCode=0 Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.102389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-599ccf9f8d-z6nsl" event={"ID":"b53fd943-6ac9-4338-b3d5-83a9627f1c78","Type":"ContainerDied","Data":"b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce"} Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.102441 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-599ccf9f8d-z6nsl" event={"ID":"b53fd943-6ac9-4338-b3d5-83a9627f1c78","Type":"ContainerDied","Data":"273976fc13151815122afbd74fa287abf905a24e9bc2809d09ffb60d97ee2f05"} Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.102473 4771 scope.go:117] "RemoveContainer" containerID="b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce" Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.102729 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-599ccf9f8d-z6nsl" Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.106960 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e15c68f3-a904-4d91-a778-4e5b5a728c9f","Type":"ContainerStarted","Data":"58e7f125e7ccbc15205a8bfbe5f06056f1945adddfbcf6e69226362cb683c29c"} Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.205222 4771 scope.go:117] "RemoveContainer" containerID="3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519" Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.223457 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-599ccf9f8d-z6nsl"] Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.232712 4771 scope.go:117] "RemoveContainer" containerID="b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce" Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.232888 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-599ccf9f8d-z6nsl"] Feb 27 01:25:06 crc kubenswrapper[4771]: E0227 01:25:06.233430 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce\": container with ID starting with b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce not found: ID does not exist" containerID="b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce" Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.233470 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce"} err="failed to get container status \"b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce\": rpc error: code = NotFound desc = could not find container \"b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce\": container with ID starting with b9817d0ae2a6a79c13a517bf8586ad9fd88bef7d7493c68d6934161255c6acce not found: ID does not exist" Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.233516 4771 scope.go:117] "RemoveContainer" containerID="3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519" Feb 27 01:25:06 crc kubenswrapper[4771]: E0227 01:25:06.233908 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519\": container with ID starting with 3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519 not found: ID does not exist" containerID="3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519" Feb 27 01:25:06 crc kubenswrapper[4771]: I0227 01:25:06.233944 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519"} err="failed to get container status \"3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519\": rpc error: code = NotFound desc = could not find container \"3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519\": container with ID starting with 3822830ec64f4a7ad43ecbbe4a7fef3f9f87b9a14213a637a15ceb89bdf68519 not found: ID does not exist" Feb 27 01:25:07 crc kubenswrapper[4771]: I0227 01:25:07.123809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e15c68f3-a904-4d91-a778-4e5b5a728c9f","Type":"ContainerStarted","Data":"3ff6ca61013cda1205e3fbde8ecc0700027391e43b7580a94155f0d2e18ab386"} Feb 27 01:25:07 crc kubenswrapper[4771]: I0227 01:25:07.787051 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" path="/var/lib/kubelet/pods/b53fd943-6ac9-4338-b3d5-83a9627f1c78/volumes" Feb 27 01:25:08 crc kubenswrapper[4771]: I0227 01:25:08.137392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e15c68f3-a904-4d91-a778-4e5b5a728c9f","Type":"ContainerStarted","Data":"00df435162845147de7833e82055c73d89692880f1f4bdc14703770f551de8a4"} Feb 27 01:25:08 crc kubenswrapper[4771]: I0227 01:25:08.157082 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.157068782 podStartE2EDuration="3.157068782s" podCreationTimestamp="2026-02-27 01:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:25:08.154355288 +0000 UTC m=+1221.091916576" watchObservedRunningTime="2026-02-27 01:25:08.157068782 +0000 UTC m=+1221.094630070" Feb 27 01:25:08 crc kubenswrapper[4771]: I0227 01:25:08.226420 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 27 01:25:10 crc kubenswrapper[4771]: I0227 01:25:10.577083 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 27 01:25:11 crc kubenswrapper[4771]: I0227 01:25:11.033284 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56bfd8fdf6-rxxnr" Feb 27 01:25:12 crc kubenswrapper[4771]: I0227 01:25:12.393925 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8f8d788-kjgv6" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.009395 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 27 01:25:14 crc kubenswrapper[4771]: E0227 01:25:14.010159 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.010175 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api" Feb 27 01:25:14 crc kubenswrapper[4771]: E0227 01:25:14.010203 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api-log" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.010210 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api-log" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.010407 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.010440 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53fd943-6ac9-4338-b3d5-83a9627f1c78" containerName="barbican-api-log" Feb 27 01:25:14 crc 
kubenswrapper[4771]: I0227 01:25:14.011158 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.013204 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gprtm" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.014199 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.014720 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.025177 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.147599 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-openstack-config-secret\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.147746 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.147842 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzfw4\" (UniqueName: \"kubernetes.io/projected/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-kube-api-access-jzfw4\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.147931 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-openstack-config\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.249317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-openstack-config-secret\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.249425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.249467 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzfw4\" (UniqueName: \"kubernetes.io/projected/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-kube-api-access-jzfw4\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc 
kubenswrapper[4771]: I0227 01:25:14.249501 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-openstack-config\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.250474 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-openstack-config\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.255671 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-openstack-config-secret\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.255952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.270376 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzfw4\" (UniqueName: \"kubernetes.io/projected/2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de-kube-api-access-jzfw4\") pod \"openstackclient\" (UID: \"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de\") " pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.379898 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 27 01:25:14 crc kubenswrapper[4771]: I0227 01:25:14.869199 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 27 01:25:14 crc kubenswrapper[4771]: W0227 01:25:14.880701 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c2dc0ad_4c8c_42bf_a442_b0c51ed3f8de.slice/crio-1dc3e502f89bca1799074cf77e2881f4f005fcec431c2a8c6e3a85832b4de2cf WatchSource:0}: Error finding container 1dc3e502f89bca1799074cf77e2881f4f005fcec431c2a8c6e3a85832b4de2cf: Status 404 returned error can't find the container with id 1dc3e502f89bca1799074cf77e2881f4f005fcec431c2a8c6e3a85832b4de2cf Feb 27 01:25:15 crc kubenswrapper[4771]: I0227 01:25:15.207810 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de","Type":"ContainerStarted","Data":"1dc3e502f89bca1799074cf77e2881f4f005fcec431c2a8c6e3a85832b4de2cf"} Feb 27 01:25:15 crc kubenswrapper[4771]: I0227 01:25:15.799711 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.599054 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7b8d8fb79c-qxz4q"] Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.600888 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.603065 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.605236 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.607424 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.621100 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b8d8fb79c-qxz4q"] Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.713466 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f1ec21-667d-46de-abbb-cb95d29e861c-run-httpd\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.713512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f1ec21-667d-46de-abbb-cb95d29e861c-log-httpd\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.713818 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-public-tls-certs\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.713894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-config-data\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.714137 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbrwg\" (UniqueName: \"kubernetes.io/projected/d0f1ec21-667d-46de-abbb-cb95d29e861c-kube-api-access-hbrwg\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.714243 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-internal-tls-certs\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.714261 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d0f1ec21-667d-46de-abbb-cb95d29e861c-etc-swift\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " 
pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.714280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-combined-ca-bundle\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.815702 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f1ec21-667d-46de-abbb-cb95d29e861c-run-httpd\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.815755 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f1ec21-667d-46de-abbb-cb95d29e861c-log-httpd\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.815820 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-public-tls-certs\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.815850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-config-data\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.815884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbrwg\" (UniqueName: \"kubernetes.io/projected/d0f1ec21-667d-46de-abbb-cb95d29e861c-kube-api-access-hbrwg\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.815925 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-internal-tls-certs\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.815941 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d0f1ec21-667d-46de-abbb-cb95d29e861c-etc-swift\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.815958 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-combined-ca-bundle\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " 
pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.816337 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f1ec21-667d-46de-abbb-cb95d29e861c-run-httpd\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.816404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0f1ec21-667d-46de-abbb-cb95d29e861c-log-httpd\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.823127 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-public-tls-certs\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.836291 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbrwg\" (UniqueName: \"kubernetes.io/projected/d0f1ec21-667d-46de-abbb-cb95d29e861c-kube-api-access-hbrwg\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.836605 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-internal-tls-certs\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.836818 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d0f1ec21-667d-46de-abbb-cb95d29e861c-etc-swift\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.837352 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-combined-ca-bundle\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.838403 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0f1ec21-667d-46de-abbb-cb95d29e861c-config-data\") pod \"swift-proxy-7b8d8fb79c-qxz4q\" (UID: \"d0f1ec21-667d-46de-abbb-cb95d29e861c\") " pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.920689 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.958781 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.959014 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerName="glance-log" containerID="cri-o://b557c51300581ae1caf95ce0698f31914ca105125868b4fefa721b4e4d20c3ad" gracePeriod=30 Feb 27 01:25:17 crc kubenswrapper[4771]: I0227 01:25:17.959128 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerName="glance-httpd" containerID="cri-o://c500d7a15278f96a54099daa331198c5ce41361d0a1b24eed20d43ca05a32c9a" gracePeriod=30 Feb 27 01:25:18 crc kubenswrapper[4771]: I0227 01:25:18.244731 4771 generic.go:334] "Generic (PLEG): container finished" podID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerID="b557c51300581ae1caf95ce0698f31914ca105125868b4fefa721b4e4d20c3ad" exitCode=143 Feb 27 01:25:18 crc kubenswrapper[4771]: I0227 01:25:18.244979 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12727ccf-0860-4f78-9d5e-4a043848ae2f","Type":"ContainerDied","Data":"b557c51300581ae1caf95ce0698f31914ca105125868b4fefa721b4e4d20c3ad"} Feb 27 01:25:18 crc kubenswrapper[4771]: I0227 01:25:18.506108 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b8d8fb79c-qxz4q"] Feb 27 01:25:18 crc kubenswrapper[4771]: W0227 01:25:18.517033 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f1ec21_667d_46de_abbb_cb95d29e861c.slice/crio-8b8f0bfe8ea91db113fea1ffac5004cfdc0dd7783df2f39aade9398e1f31233e WatchSource:0}: Error finding container 8b8f0bfe8ea91db113fea1ffac5004cfdc0dd7783df2f39aade9398e1f31233e: Status 404 returned error can't find the container with id 8b8f0bfe8ea91db113fea1ffac5004cfdc0dd7783df2f39aade9398e1f31233e Feb 27 01:25:18 crc kubenswrapper[4771]: I0227 01:25:18.855283 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 27 01:25:18 crc kubenswrapper[4771]: I0227 01:25:18.855695 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" containerName="glance-httpd" containerID="cri-o://737b90e10a3eea907ddf7d3af1f7885d11f3951a631955b0c824af3f50a9825b" gracePeriod=30 Feb 27 01:25:18 crc kubenswrapper[4771]: I0227 01:25:18.864974 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" containerName="glance-log" containerID="cri-o://d3facf87ac24c5acda65fef3df745ef287c70c4a52a00da437d5cd470942dbe9" gracePeriod=30 Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.260404 4771 generic.go:334] "Generic (PLEG): container finished" podID="8bef7a7d-188a-4d22-9031-8365098a761f" containerID="d3facf87ac24c5acda65fef3df745ef287c70c4a52a00da437d5cd470942dbe9" exitCode=143 Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.260501 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"8bef7a7d-188a-4d22-9031-8365098a761f","Type":"ContainerDied","Data":"d3facf87ac24c5acda65fef3df745ef287c70c4a52a00da437d5cd470942dbe9"} Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.262845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" event={"ID":"d0f1ec21-667d-46de-abbb-cb95d29e861c","Type":"ContainerStarted","Data":"f011f2e29575b4d48bfd4e82ae9f89cbb4da91a21135d0670edd30e5525748df"} Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.262880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" event={"ID":"d0f1ec21-667d-46de-abbb-cb95d29e861c","Type":"ContainerStarted","Data":"2b0ff6e623cbd518d48a0cdc131feead40d12c2f39dc3c212a0a2237e3239dc7"} Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.262894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" event={"ID":"d0f1ec21-667d-46de-abbb-cb95d29e861c","Type":"ContainerStarted","Data":"8b8f0bfe8ea91db113fea1ffac5004cfdc0dd7783df2f39aade9398e1f31233e"} Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.264054 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.264080 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.319147 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.355990 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" podStartSLOduration=2.355973746 podStartE2EDuration="2.355973746s" podCreationTimestamp="2026-02-27 01:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:25:19.291581557 +0000 UTC m=+1232.229142835" watchObservedRunningTime="2026-02-27 01:25:19.355973746 +0000 UTC m=+1232.293535034" Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.827378 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.828725 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="sg-core" containerID="cri-o://550648b9982885f7e4e9c7c22d09cc61b35ae7c81dca6526f000a6f13a2ecb1b" gracePeriod=30 Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.828819 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="ceilometer-notification-agent" containerID="cri-o://fa6776706eed4113429819150516809f826cee28acfecc2925bd2ca2e23044da" gracePeriod=30 Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.828676 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="ceilometer-central-agent" containerID="cri-o://11ce1410c8bac052456f2674c6d6df4df2b88857e5958f89e1bd872be3ce0ff5" gracePeriod=30 Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.828814 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="proxy-httpd" containerID="cri-o://89e7cd8bc60506a2a24f7532e7eab47b97372fc8e3b8d634886cd4a769da84d2" gracePeriod=30 Feb 27 01:25:19 crc kubenswrapper[4771]: I0227 01:25:19.932635 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.175:3000/\": read tcp 10.217.0.2:48984->10.217.0.175:3000: read: connection reset by peer" Feb 27 01:25:20 crc kubenswrapper[4771]: I0227 01:25:20.273457 4771 generic.go:334] "Generic (PLEG): container finished" podID="27da755f-7147-4fee-af32-994932f0b715" containerID="89e7cd8bc60506a2a24f7532e7eab47b97372fc8e3b8d634886cd4a769da84d2" exitCode=0 Feb 27 01:25:20 crc kubenswrapper[4771]: I0227 01:25:20.273488 4771 generic.go:334] "Generic (PLEG): container finished" podID="27da755f-7147-4fee-af32-994932f0b715" containerID="550648b9982885f7e4e9c7c22d09cc61b35ae7c81dca6526f000a6f13a2ecb1b" exitCode=2 Feb 27 01:25:20 crc kubenswrapper[4771]: I0227 01:25:20.273543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerDied","Data":"89e7cd8bc60506a2a24f7532e7eab47b97372fc8e3b8d634886cd4a769da84d2"} Feb 27 01:25:20 crc kubenswrapper[4771]: I0227 01:25:20.273599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerDied","Data":"550648b9982885f7e4e9c7c22d09cc61b35ae7c81dca6526f000a6f13a2ecb1b"} Feb 27 01:25:20 crc kubenswrapper[4771]: I0227 01:25:20.309964 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5cddbc5576-b9kzz" Feb 27 01:25:21 crc kubenswrapper[4771]: I0227 01:25:21.285777 4771 generic.go:334] "Generic (PLEG): container finished" podID="27da755f-7147-4fee-af32-994932f0b715" containerID="11ce1410c8bac052456f2674c6d6df4df2b88857e5958f89e1bd872be3ce0ff5" exitCode=0 Feb 27 01:25:21 crc kubenswrapper[4771]: I0227 01:25:21.285857 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerDied","Data":"11ce1410c8bac052456f2674c6d6df4df2b88857e5958f89e1bd872be3ce0ff5"} Feb 27 01:25:21 crc kubenswrapper[4771]: I0227 01:25:21.290094 4771 generic.go:334] "Generic (PLEG): container finished" podID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerID="c500d7a15278f96a54099daa331198c5ce41361d0a1b24eed20d43ca05a32c9a" exitCode=0 Feb 27 01:25:21 crc kubenswrapper[4771]: I0227 01:25:21.290166 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12727ccf-0860-4f78-9d5e-4a043848ae2f","Type":"ContainerDied","Data":"c500d7a15278f96a54099daa331198c5ce41361d0a1b24eed20d43ca05a32c9a"} Feb 27 01:25:22 crc kubenswrapper[4771]: I0227 01:25:22.302237 4771 generic.go:334] "Generic (PLEG): container finished" podID="8bef7a7d-188a-4d22-9031-8365098a761f" containerID="737b90e10a3eea907ddf7d3af1f7885d11f3951a631955b0c824af3f50a9825b" exitCode=0 Feb 27 01:25:22 crc kubenswrapper[4771]: I0227 01:25:22.302287 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8bef7a7d-188a-4d22-9031-8365098a761f","Type":"ContainerDied","Data":"737b90e10a3eea907ddf7d3af1f7885d11f3951a631955b0c824af3f50a9825b"} Feb 27 
01:25:22 crc kubenswrapper[4771]: I0227 01:25:22.393946 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8f8d788-kjgv6" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Feb 27 01:25:22 crc kubenswrapper[4771]: I0227 01:25:22.394041 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fb8f8d788-kjgv6" Feb 27 01:25:23 crc kubenswrapper[4771]: I0227 01:25:23.924995 4771 scope.go:117] "RemoveContainer" containerID="b2398ecd90e0f24eb235ae146443c005e5a83958bb522228dd6c94a74044a3ba" Feb 27 01:25:24 crc kubenswrapper[4771]: I0227 01:25:24.356811 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6fd6bd959-l4htk" Feb 27 01:25:24 crc kubenswrapper[4771]: I0227 01:25:24.429829 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fc95dbbd4-gfl9m"] Feb 27 01:25:24 crc kubenswrapper[4771]: I0227 01:25:24.434273 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fc95dbbd4-gfl9m" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerName="neutron-api" containerID="cri-o://5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379" gracePeriod=30 Feb 27 01:25:24 crc kubenswrapper[4771]: I0227 01:25:24.434423 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fc95dbbd4-gfl9m" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerName="neutron-httpd" containerID="cri-o://1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57" gracePeriod=30 Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.056531 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.187239 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.255749 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-httpd-run\") pod \"12727ccf-0860-4f78-9d5e-4a043848ae2f\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.255886 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-config-data\") pod \"12727ccf-0860-4f78-9d5e-4a043848ae2f\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.255932 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-public-tls-certs\") pod \"12727ccf-0860-4f78-9d5e-4a043848ae2f\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.255992 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-combined-ca-bundle\") pod \"12727ccf-0860-4f78-9d5e-4a043848ae2f\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.256233 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-logs\") pod \"12727ccf-0860-4f78-9d5e-4a043848ae2f\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.256324 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-scripts\") pod \"12727ccf-0860-4f78-9d5e-4a043848ae2f\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.256781 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "12727ccf-0860-4f78-9d5e-4a043848ae2f" (UID: "12727ccf-0860-4f78-9d5e-4a043848ae2f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.256868 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-logs" (OuterVolumeSpecName: "logs") pod "12727ccf-0860-4f78-9d5e-4a043848ae2f" (UID: "12727ccf-0860-4f78-9d5e-4a043848ae2f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.257116 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfvr\" (UniqueName: \"kubernetes.io/projected/12727ccf-0860-4f78-9d5e-4a043848ae2f-kube-api-access-htfvr\") pod \"12727ccf-0860-4f78-9d5e-4a043848ae2f\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.257157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"12727ccf-0860-4f78-9d5e-4a043848ae2f\" (UID: \"12727ccf-0860-4f78-9d5e-4a043848ae2f\") " Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.257652 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.257663 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12727ccf-0860-4f78-9d5e-4a043848ae2f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.262128 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "12727ccf-0860-4f78-9d5e-4a043848ae2f" (UID: "12727ccf-0860-4f78-9d5e-4a043848ae2f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.262226 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12727ccf-0860-4f78-9d5e-4a043848ae2f-kube-api-access-htfvr" (OuterVolumeSpecName: "kube-api-access-htfvr") pod "12727ccf-0860-4f78-9d5e-4a043848ae2f" (UID: "12727ccf-0860-4f78-9d5e-4a043848ae2f"). InnerVolumeSpecName "kube-api-access-htfvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.269336 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-scripts" (OuterVolumeSpecName: "scripts") pod "12727ccf-0860-4f78-9d5e-4a043848ae2f" (UID: "12727ccf-0860-4f78-9d5e-4a043848ae2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.298202 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12727ccf-0860-4f78-9d5e-4a043848ae2f" (UID: "12727ccf-0860-4f78-9d5e-4a043848ae2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.321769 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "12727ccf-0860-4f78-9d5e-4a043848ae2f" (UID: "12727ccf-0860-4f78-9d5e-4a043848ae2f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.338819 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de","Type":"ContainerStarted","Data":"266b7b1864a233d540782451da61cebb67db35f131e938d7689429224512e741"}
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.345751 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12727ccf-0860-4f78-9d5e-4a043848ae2f","Type":"ContainerDied","Data":"e762342ba95a7afb59992ca9768d565ff447eb17bf07346eb04277d028f705b6"}
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.345795 4771 scope.go:117] "RemoveContainer" containerID="c500d7a15278f96a54099daa331198c5ce41361d0a1b24eed20d43ca05a32c9a"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.346270 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.349235 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-config-data" (OuterVolumeSpecName: "config-data") pod "12727ccf-0860-4f78-9d5e-4a043848ae2f" (UID: "12727ccf-0860-4f78-9d5e-4a043848ae2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.353572 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.439981918 podStartE2EDuration="12.353536824s" podCreationTimestamp="2026-02-27 01:25:13 +0000 UTC" firstStartedPulling="2026-02-27 01:25:14.884552568 +0000 UTC m=+1227.822113846" lastFinishedPulling="2026-02-27 01:25:24.798107464 +0000 UTC m=+1237.735668752" observedRunningTime="2026-02-27 01:25:25.352217038 +0000 UTC m=+1238.289778326" watchObservedRunningTime="2026-02-27 01:25:25.353536824 +0000 UTC m=+1238.291098112"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.358341 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.358394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8bef7a7d-188a-4d22-9031-8365098a761f","Type":"ContainerDied","Data":"443aa11c841954f61b12cb4c64afe7d29f64252f8276714878886676cb35f87a"}
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.359498 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-logs\") pod \"8bef7a7d-188a-4d22-9031-8365098a761f\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.359590 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-httpd-run\") pod \"8bef7a7d-188a-4d22-9031-8365098a761f\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.360069 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-logs" (OuterVolumeSpecName: "logs") pod "8bef7a7d-188a-4d22-9031-8365098a761f" (UID: "8bef7a7d-188a-4d22-9031-8365098a761f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.360110 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7hmb\" (UniqueName: \"kubernetes.io/projected/8bef7a7d-188a-4d22-9031-8365098a761f-kube-api-access-j7hmb\") pod \"8bef7a7d-188a-4d22-9031-8365098a761f\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.360144 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8bef7a7d-188a-4d22-9031-8365098a761f\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.360572 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-config-data\") pod \"8bef7a7d-188a-4d22-9031-8365098a761f\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.360612 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-internal-tls-certs\") pod \"8bef7a7d-188a-4d22-9031-8365098a761f\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.360677 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-combined-ca-bundle\") pod \"8bef7a7d-188a-4d22-9031-8365098a761f\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.360730 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-scripts\") pod \"8bef7a7d-188a-4d22-9031-8365098a761f\" (UID: \"8bef7a7d-188a-4d22-9031-8365098a761f\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.362616 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.365453 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bef7a7d-188a-4d22-9031-8365098a761f-kube-api-access-j7hmb" (OuterVolumeSpecName: "kube-api-access-j7hmb") pod "8bef7a7d-188a-4d22-9031-8365098a761f" (UID: "8bef7a7d-188a-4d22-9031-8365098a761f"). InnerVolumeSpecName "kube-api-access-j7hmb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.366762 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.366796 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.366827 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-logs\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.366843 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12727ccf-0860-4f78-9d5e-4a043848ae2f-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.366854 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfvr\" (UniqueName: \"kubernetes.io/projected/12727ccf-0860-4f78-9d5e-4a043848ae2f-kube-api-access-htfvr\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.366874 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.367473 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8bef7a7d-188a-4d22-9031-8365098a761f" (UID: "8bef7a7d-188a-4d22-9031-8365098a761f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.383459 4771 generic.go:334] "Generic (PLEG): container finished" podID="27da755f-7147-4fee-af32-994932f0b715" containerID="fa6776706eed4113429819150516809f826cee28acfecc2925bd2ca2e23044da" exitCode=0
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.383520 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerDied","Data":"fa6776706eed4113429819150516809f826cee28acfecc2925bd2ca2e23044da"}
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.384950 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "8bef7a7d-188a-4d22-9031-8365098a761f" (UID: "8bef7a7d-188a-4d22-9031-8365098a761f"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.386187 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-scripts" (OuterVolumeSpecName: "scripts") pod "8bef7a7d-188a-4d22-9031-8365098a761f" (UID: "8bef7a7d-188a-4d22-9031-8365098a761f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.386889 4771 generic.go:334] "Generic (PLEG): container finished" podID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerID="1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57" exitCode=0
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.386917 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc95dbbd4-gfl9m" event={"ID":"dd57f8bd-d811-4740-b644-f8d69d329d5c","Type":"ContainerDied","Data":"1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57"}
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.388481 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.397401 4771 scope.go:117] "RemoveContainer" containerID="b557c51300581ae1caf95ce0698f31914ca105125868b4fefa721b4e4d20c3ad"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.421386 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.422005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bef7a7d-188a-4d22-9031-8365098a761f" (UID: "8bef7a7d-188a-4d22-9031-8365098a761f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.430782 4771 scope.go:117] "RemoveContainer" containerID="737b90e10a3eea907ddf7d3af1f7885d11f3951a631955b0c824af3f50a9825b"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.458085 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-config-data" (OuterVolumeSpecName: "config-data") pod "8bef7a7d-188a-4d22-9031-8365098a761f" (UID: "8bef7a7d-188a-4d22-9031-8365098a761f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.466235 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8bef7a7d-188a-4d22-9031-8365098a761f" (UID: "8bef7a7d-188a-4d22-9031-8365098a761f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.469264 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frmd5\" (UniqueName: \"kubernetes.io/projected/27da755f-7147-4fee-af32-994932f0b715-kube-api-access-frmd5\") pod \"27da755f-7147-4fee-af32-994932f0b715\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.471772 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-config-data\") pod \"27da755f-7147-4fee-af32-994932f0b715\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.471813 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-log-httpd\") pod \"27da755f-7147-4fee-af32-994932f0b715\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.471856 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-scripts\") pod \"27da755f-7147-4fee-af32-994932f0b715\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.471887 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-run-httpd\") pod \"27da755f-7147-4fee-af32-994932f0b715\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.471927 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27da755f-7147-4fee-af32-994932f0b715-kube-api-access-frmd5" (OuterVolumeSpecName: "kube-api-access-frmd5") pod "27da755f-7147-4fee-af32-994932f0b715" (UID: "27da755f-7147-4fee-af32-994932f0b715"). InnerVolumeSpecName "kube-api-access-frmd5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.471977 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-sg-core-conf-yaml\") pod \"27da755f-7147-4fee-af32-994932f0b715\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.471997 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-combined-ca-bundle\") pod \"27da755f-7147-4fee-af32-994932f0b715\" (UID: \"27da755f-7147-4fee-af32-994932f0b715\") "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472749 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472764 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frmd5\" (UniqueName: \"kubernetes.io/projected/27da755f-7147-4fee-af32-994932f0b715-kube-api-access-frmd5\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472774 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472784 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8bef7a7d-188a-4d22-9031-8365098a761f-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472792 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7hmb\" (UniqueName: \"kubernetes.io/projected/8bef7a7d-188a-4d22-9031-8365098a761f-kube-api-access-j7hmb\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472812 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472822 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472832 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472840 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bef7a7d-188a-4d22-9031-8365098a761f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.472300 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "27da755f-7147-4fee-af32-994932f0b715" (UID: "27da755f-7147-4fee-af32-994932f0b715"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.473788 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "27da755f-7147-4fee-af32-994932f0b715" (UID: "27da755f-7147-4fee-af32-994932f0b715"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.484183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-scripts" (OuterVolumeSpecName: "scripts") pod "27da755f-7147-4fee-af32-994932f0b715" (UID: "27da755f-7147-4fee-af32-994932f0b715"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.488243 4771 scope.go:117] "RemoveContainer" containerID="d3facf87ac24c5acda65fef3df745ef287c70c4a52a00da437d5cd470942dbe9"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.493219 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.506422 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "27da755f-7147-4fee-af32-994932f0b715" (UID: "27da755f-7147-4fee-af32-994932f0b715"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.560215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27da755f-7147-4fee-af32-994932f0b715" (UID: "27da755f-7147-4fee-af32-994932f0b715"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.575295 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.575330 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.575341 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27da755f-7147-4fee-af32-994932f0b715-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.575350 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.575359 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.575368 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.621378 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-config-data" (OuterVolumeSpecName: "config-data") pod "27da755f-7147-4fee-af32-994932f0b715" (UID: "27da755f-7147-4fee-af32-994932f0b715"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.676566 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27da755f-7147-4fee-af32-994932f0b715-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.747094 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.843703 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.843776 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 01:25:25 crc kubenswrapper[4771]: E0227 01:25:25.844238 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="ceilometer-central-agent"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844255 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="ceilometer-central-agent"
Feb 27 01:25:25 crc kubenswrapper[4771]: E0227 01:25:25.844271 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" containerName="glance-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844276 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" containerName="glance-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: E0227 01:25:25.844289 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerName="glance-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844295 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerName="glance-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: E0227 01:25:25.844331 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="sg-core"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844336 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="sg-core"
Feb 27 01:25:25 crc kubenswrapper[4771]: E0227 01:25:25.844346 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="proxy-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844352 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="proxy-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: E0227 01:25:25.844363 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerName="glance-log"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844368 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerName="glance-log"
Feb 27 01:25:25 crc kubenswrapper[4771]: E0227 01:25:25.844377 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="ceilometer-notification-agent"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844399 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="ceilometer-notification-agent"
Feb 27 01:25:25 crc kubenswrapper[4771]: E0227 01:25:25.844409 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" containerName="glance-log"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844415 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" containerName="glance-log"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844638 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="sg-core"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844651 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="proxy-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844660 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" containerName="glance-log"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844669 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerName="glance-log"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844679 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="ceilometer-central-agent"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844686 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" containerName="glance-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844712 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" containerName="glance-httpd"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.844724 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="ceilometer-notification-agent"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.845859 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.849765 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jpjnq"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.849810 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.851740 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.854113 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.854279 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.861978 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.871238 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.880417 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.886010 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.896301 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.901142 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.901318 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.987250 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.987317 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-config-data\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.987349 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03b15be0-3bda-4754-b43a-35e34cb84fcb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.987372 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.987425 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.987539 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvch4\" (UniqueName: \"kubernetes.io/projected/03b15be0-3bda-4754-b43a-35e34cb84fcb-kube-api-access-zvch4\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.987641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-scripts\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:25 crc kubenswrapper[4771]: I0227 01:25:25.987697 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03b15be0-3bda-4754-b43a-35e34cb84fcb-logs\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089233 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089289 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089325 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03b15be0-3bda-4754-b43a-35e34cb84fcb-logs\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089359 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089405 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-config-data\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089438 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03b15be0-3bda-4754-b43a-35e34cb84fcb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089486 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089648 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089810 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03b15be0-3bda-4754-b43a-35e34cb84fcb-logs\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.089918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03b15be0-3bda-4754-b43a-35e34cb84fcb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.090265 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.090305 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.090339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.090490 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvch4\" (UniqueName: \"kubernetes.io/projected/03b15be0-3bda-4754-b43a-35e34cb84fcb-kube-api-access-zvch4\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.090523 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmt99\" (UniqueName: \"kubernetes.io/projected/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-kube-api-access-pmt99\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.090587 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-scripts\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.090632 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.090673 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.106337 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-scripts\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.106821 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.106939 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.110257 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvch4\" (UniqueName: \"kubernetes.io/projected/03b15be0-3bda-4754-b43a-35e34cb84fcb-kube-api-access-zvch4\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.114777 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b15be0-3bda-4754-b43a-35e34cb84fcb-config-data\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.127522 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"03b15be0-3bda-4754-b43a-35e34cb84fcb\") " pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.187021 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192220 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192305 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192330 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192377 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192431 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192479 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmt99\" (UniqueName: \"kubernetes.io/projected/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-kube-api-access-pmt99\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192888 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.192950 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.193387 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.196250 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.197245 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.197256 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.197461 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.212134 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmt99\" (UniqueName: \"kubernetes.io/projected/8c197b80-0aa2-49fd-b9d6-19cbb40e59e3-kube-api-access-pmt99\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.226397 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3\") " pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.432689 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.432681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27da755f-7147-4fee-af32-994932f0b715","Type":"ContainerDied","Data":"4a1c936adc8313926f5ce4da7087dea8f9f9f83321d2ee37df906d9553bbd2c4"}
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.433166 4771 scope.go:117] "RemoveContainer" containerID="89e7cd8bc60506a2a24f7532e7eab47b97372fc8e3b8d634886cd4a769da84d2"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.464309 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.480753 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.485093 4771 scope.go:117] "RemoveContainer" containerID="550648b9982885f7e4e9c7c22d09cc61b35ae7c81dca6526f000a6f13a2ecb1b"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.492921 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.495185 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.499217 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.507083 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.511132 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.521725 4771 scope.go:117] "RemoveContainer" containerID="fa6776706eed4113429819150516809f826cee28acfecc2925bd2ca2e23044da"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.530465 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.605564 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-log-httpd\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.605610 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-config-data\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.605648 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.605667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-scripts\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.605718 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.605750 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-run-httpd\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.605785 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8nmw\" (UniqueName: \"kubernetes.io/projected/5dcddaa5-567b-4ee7-ba36-894719a998c9-kube-api-access-n8nmw\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.653869 4771 scope.go:117] "RemoveContainer" containerID="11ce1410c8bac052456f2674c6d6df4df2b88857e5958f89e1bd872be3ce0ff5"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.709111 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-log-httpd\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.709168 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-config-data\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.709209 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.709230 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-scripts\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.709267 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.709301 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-run-httpd\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.709338 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8nmw\" (UniqueName: \"kubernetes.io/projected/5dcddaa5-567b-4ee7-ba36-894719a998c9-kube-api-access-n8nmw\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.710227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-log-httpd\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.711900 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-run-httpd\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.723694 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-scripts\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.726197 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.728745 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.730943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8nmw\" (UniqueName: \"kubernetes.io/projected/5dcddaa5-567b-4ee7-ba36-894719a998c9-kube-api-access-n8nmw\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.731587 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-config-data\") pod \"ceilometer-0\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " pod="openstack/ceilometer-0"
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.821415 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 27 01:25:26 crc kubenswrapper[4771]: I0227 01:25:26.945070 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.114006 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 27 01:25:27 crc kubenswrapper[4771]: W0227 01:25:27.116277 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c197b80_0aa2_49fd_b9d6_19cbb40e59e3.slice/crio-046553cc3295187d68321214217948008fda57b8120db98c6a2762871237d11d WatchSource:0}: Error finding container 046553cc3295187d68321214217948008fda57b8120db98c6a2762871237d11d: Status 404 returned error can't find the container with id 046553cc3295187d68321214217948008fda57b8120db98c6a2762871237d11d
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.450697 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.465375 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3","Type":"ContainerStarted","Data":"046553cc3295187d68321214217948008fda57b8120db98c6a2762871237d11d"}
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.471748 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"03b15be0-3bda-4754-b43a-35e34cb84fcb","Type":"ContainerStarted","Data":"636d46801fb93ee8e3323a8edd6a5896ae5db37bf0684fb86e377e0cb0aad07b"}
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.471800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"03b15be0-3bda-4754-b43a-35e34cb84fcb","Type":"ContainerStarted","Data":"cde84bece80028d9add9cce8dd841316c61265e79281cc08649d9a72afbc4d86"}
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.475921 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.787309 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12727ccf-0860-4f78-9d5e-4a043848ae2f" path="/var/lib/kubelet/pods/12727ccf-0860-4f78-9d5e-4a043848ae2f/volumes"
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.788246 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27da755f-7147-4fee-af32-994932f0b715" path="/var/lib/kubelet/pods/27da755f-7147-4fee-af32-994932f0b715/volumes"
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.790836 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bef7a7d-188a-4d22-9031-8365098a761f" path="/var/lib/kubelet/pods/8bef7a7d-188a-4d22-9031-8365098a761f/volumes"
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.929970 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q"
Feb 27 01:25:27 crc kubenswrapper[4771]: I0227 01:25:27.931660 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q"
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.357504 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb8f8d788-kjgv6"
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.463307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-scripts\") pod \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") "
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.463350 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-secret-key\") pod \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") "
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.463464 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-logs\") pod \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") "
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.463523 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-combined-ca-bundle\") pod \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") "
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.463615 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-config-data\") pod \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") "
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.463664 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62nt5\" (UniqueName: \"kubernetes.io/projected/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-kube-api-access-62nt5\") pod \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") "
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.463753 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-tls-certs\") pod \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\" (UID: \"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7\") "
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.473078 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-logs" (OuterVolumeSpecName: "logs") pod "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" (UID: "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.495868 4771 generic.go:334] "Generic (PLEG): container finished" podID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerID="f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3" exitCode=137
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.495929 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8f8d788-kjgv6" event={"ID":"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7","Type":"ContainerDied","Data":"f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3"}
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.495956 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8f8d788-kjgv6" event={"ID":"9d4b37ec-a290-4c5c-9a05-31e0499c3ed7","Type":"ContainerDied","Data":"7606139dd1784903ed5746e62418199c84c7f11ef1ad702a2612f7414ea95dcb"}
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.495973 4771 scope.go:117] "RemoveContainer" containerID="ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d"
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.496102 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb8f8d788-kjgv6"
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.496663 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-kube-api-access-62nt5" (OuterVolumeSpecName: "kube-api-access-62nt5") pod "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" (UID: "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7"). InnerVolumeSpecName "kube-api-access-62nt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.497130 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" (UID: "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.501471 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-scripts" (OuterVolumeSpecName: "scripts") pod "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" (UID: "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.502926 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerStarted","Data":"81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66"}
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.502973 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerStarted","Data":"71cf574ac7e9bbe5ce6b9dc88520671ad9b18769f1d45e6aeb12ab23cb24800d"}
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.508264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3","Type":"ContainerStarted","Data":"835ad71a7366520ceabd61749c89c97ca31e6410506989fdd339ea7c58445738"}
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.511369 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"03b15be0-3bda-4754-b43a-35e34cb84fcb","Type":"ContainerStarted","Data":"6594b07b7958b278ce209743d9232177f31cdf4bbcd0900dd7ea62b23df03df7"}
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.536237 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-config-data" (OuterVolumeSpecName: "config-data") pod "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" (UID: "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.544933 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.544902533 podStartE2EDuration="3.544902533s" podCreationTimestamp="2026-02-27 01:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:25:28.534356826 +0000 UTC m=+1241.471918114" watchObservedRunningTime="2026-02-27 01:25:28.544902533 +0000 UTC m=+1241.482463821"
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.567775 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-logs\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.567797 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.567807 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62nt5\" (UniqueName: \"kubernetes.io/projected/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-kube-api-access-62nt5\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.567815 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.567826 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName:
\"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.572061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" (UID: "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.580353 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" (UID: "9d4b37ec-a290-4c5c-9a05-31e0499c3ed7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.663917 4771 scope.go:117] "RemoveContainer" containerID="f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.670407 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.670436 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.689625 4771 scope.go:117] "RemoveContainer" containerID="ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d" Feb 27 01:25:28 crc kubenswrapper[4771]: E0227 01:25:28.693939 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d\": container with ID starting with ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d not found: ID does not exist" containerID="ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.693974 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d"} err="failed to get container status \"ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d\": rpc error: code = NotFound desc = could not find container \"ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d\": container with ID starting with ff183ff6573a5763d5b2e109ee1eb7352e632d657f81e62dff6a2c6bc7b3dd1d not found: ID does not exist" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.693995 4771 scope.go:117] "RemoveContainer" containerID="f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3" Feb 27 01:25:28 crc kubenswrapper[4771]: E0227 01:25:28.694224 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3\": container with ID starting with f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3 not found: ID does not exist" 
containerID="f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.694242 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3"} err="failed to get container status \"f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3\": rpc error: code = NotFound desc = could not find container \"f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3\": container with ID starting with f00f02e0a0a413602901c163d4c12c6a7fe17ee6a0b5395d18a0d9b6c3ffe9f3 not found: ID does not exist" Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.982304 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fb8f8d788-kjgv6"] Feb 27 01:25:28 crc kubenswrapper[4771]: I0227 01:25:28.993797 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7fb8f8d788-kjgv6"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.428008 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7hm42"] Feb 27 01:25:29 crc kubenswrapper[4771]: E0227 01:25:29.428483 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.428498 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon" Feb 27 01:25:29 crc kubenswrapper[4771]: E0227 01:25:29.428535 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon-log" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.428544 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon-log" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.428775 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon-log" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.428810 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" containerName="horizon" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.429505 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.435977 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7hm42"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.559167 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c197b80-0aa2-49fd-b9d6-19cbb40e59e3","Type":"ContainerStarted","Data":"b9b9b0fbf6423ea547fcb291b4d17f3edcea589dcad5a4c08a6631829b7e39bc"} Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.587469 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-ssvdh"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.588584 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.588714 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhrf9\" (UniqueName: \"kubernetes.io/projected/ac233507-57ad-484c-817b-270cee86a50a-kube-api-access-vhrf9\") pod \"nova-api-db-create-7hm42\" (UID: \"ac233507-57ad-484c-817b-270cee86a50a\") " pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.588860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac233507-57ad-484c-817b-270cee86a50a-operator-scripts\") pod \"nova-api-db-create-7hm42\" (UID: \"ac233507-57ad-484c-817b-270cee86a50a\") " pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.628402 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.62838087 podStartE2EDuration="4.62838087s" podCreationTimestamp="2026-02-27 01:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:25:29.603099403 +0000 UTC m=+1242.540660691" watchObservedRunningTime="2026-02-27 01:25:29.62838087 +0000 UTC m=+1242.565942148" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.628974 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ssvdh"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.629027 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerStarted","Data":"db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659"} Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.671450 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kz4w4"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.672591 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.689397 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kz4w4"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.690468 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhrf9\" (UniqueName: \"kubernetes.io/projected/ac233507-57ad-484c-817b-270cee86a50a-kube-api-access-vhrf9\") pod \"nova-api-db-create-7hm42\" (UID: \"ac233507-57ad-484c-817b-270cee86a50a\") " pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.690530 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac233507-57ad-484c-817b-270cee86a50a-operator-scripts\") pod \"nova-api-db-create-7hm42\" (UID: \"ac233507-57ad-484c-817b-270cee86a50a\") " pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.690641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mbt6\" (UniqueName: \"kubernetes.io/projected/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-kube-api-access-8mbt6\") pod \"nova-cell0-db-create-ssvdh\" (UID: \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\") " pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.690669 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-operator-scripts\") pod \"nova-cell0-db-create-ssvdh\" (UID: \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\") " pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.693756 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac233507-57ad-484c-817b-270cee86a50a-operator-scripts\") pod \"nova-api-db-create-7hm42\" (UID: \"ac233507-57ad-484c-817b-270cee86a50a\") " pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.707723 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a6fe-account-create-update-52b7q"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.708906 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.713857 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.721468 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a6fe-account-create-update-52b7q"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.728573 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhrf9\" (UniqueName: \"kubernetes.io/projected/ac233507-57ad-484c-817b-270cee86a50a-kube-api-access-vhrf9\") pod \"nova-api-db-create-7hm42\" (UID: \"ac233507-57ad-484c-817b-270cee86a50a\") " pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.757021 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.792762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mbt6\" (UniqueName: \"kubernetes.io/projected/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-kube-api-access-8mbt6\") pod \"nova-cell0-db-create-ssvdh\" (UID: \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\") " pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.792832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-operator-scripts\") pod \"nova-cell0-db-create-ssvdh\" (UID: \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\") " pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.794526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-operator-scripts\") pod \"nova-api-a6fe-account-create-update-52b7q\" (UID: \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\") " pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.794691 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-operator-scripts\") pod \"nova-cell0-db-create-ssvdh\" (UID: \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\") " pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.794840 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18a7c26-064f-4b67-b9e2-d8a66499cec8-operator-scripts\") pod \"nova-cell1-db-create-kz4w4\" (UID: \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\") " pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.794922 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5hkg\" (UniqueName: \"kubernetes.io/projected/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-kube-api-access-v5hkg\") pod \"nova-api-a6fe-account-create-update-52b7q\" (UID: \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\") " pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.794969 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqkq7\" (UniqueName: \"kubernetes.io/projected/e18a7c26-064f-4b67-b9e2-d8a66499cec8-kube-api-access-qqkq7\") pod \"nova-cell1-db-create-kz4w4\" (UID: \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\") " pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.807460 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4b37ec-a290-4c5c-9a05-31e0499c3ed7" path="/var/lib/kubelet/pods/9d4b37ec-a290-4c5c-9a05-31e0499c3ed7/volumes" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.813875 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mbt6\" (UniqueName: \"kubernetes.io/projected/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-kube-api-access-8mbt6\") pod \"nova-cell0-db-create-ssvdh\" (UID: \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\") " 
pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.848622 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d71f-account-create-update-qphgx"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.849785 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.853258 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.857415 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d71f-account-create-update-qphgx"] Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.867498 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.898598 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5hkg\" (UniqueName: \"kubernetes.io/projected/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-kube-api-access-v5hkg\") pod \"nova-api-a6fe-account-create-update-52b7q\" (UID: \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\") " pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.898642 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqkq7\" (UniqueName: \"kubernetes.io/projected/e18a7c26-064f-4b67-b9e2-d8a66499cec8-kube-api-access-qqkq7\") pod \"nova-cell1-db-create-kz4w4\" (UID: \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\") " pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.898694 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e8c01e-0567-48bd-aaef-580afc5667af-operator-scripts\") pod \"nova-cell0-d71f-account-create-update-qphgx\" (UID: \"e6e8c01e-0567-48bd-aaef-580afc5667af\") " pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.898825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-operator-scripts\") pod \"nova-api-a6fe-account-create-update-52b7q\" (UID: \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\") " pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.898903 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwqft\" (UniqueName: \"kubernetes.io/projected/e6e8c01e-0567-48bd-aaef-580afc5667af-kube-api-access-mwqft\") pod \"nova-cell0-d71f-account-create-update-qphgx\" (UID: \"e6e8c01e-0567-48bd-aaef-580afc5667af\") " pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.898964 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18a7c26-064f-4b67-b9e2-d8a66499cec8-operator-scripts\") pod \"nova-cell1-db-create-kz4w4\" (UID: \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\") " pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.899577 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18a7c26-064f-4b67-b9e2-d8a66499cec8-operator-scripts\") pod \"nova-cell1-db-create-kz4w4\" (UID: \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\") " pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.901492 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-operator-scripts\") pod \"nova-api-a6fe-account-create-update-52b7q\" (UID: \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\") " pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.923937 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqkq7\" (UniqueName: \"kubernetes.io/projected/e18a7c26-064f-4b67-b9e2-d8a66499cec8-kube-api-access-qqkq7\") pod \"nova-cell1-db-create-kz4w4\" (UID: \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\") " pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:29 crc kubenswrapper[4771]: I0227 01:25:29.925500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5hkg\" (UniqueName: \"kubernetes.io/projected/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-kube-api-access-v5hkg\") pod \"nova-api-a6fe-account-create-update-52b7q\" (UID: \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\") " pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.000218 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwqft\" (UniqueName: \"kubernetes.io/projected/e6e8c01e-0567-48bd-aaef-580afc5667af-kube-api-access-mwqft\") pod \"nova-cell0-d71f-account-create-update-qphgx\" (UID: \"e6e8c01e-0567-48bd-aaef-580afc5667af\") " pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.000584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e8c01e-0567-48bd-aaef-580afc5667af-operator-scripts\") pod \"nova-cell0-d71f-account-create-update-qphgx\" (UID: \"e6e8c01e-0567-48bd-aaef-580afc5667af\") " pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.001400 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e8c01e-0567-48bd-aaef-580afc5667af-operator-scripts\") pod \"nova-cell0-d71f-account-create-update-qphgx\" (UID: \"e6e8c01e-0567-48bd-aaef-580afc5667af\") " pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.019071 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwqft\" (UniqueName: \"kubernetes.io/projected/e6e8c01e-0567-48bd-aaef-580afc5667af-kube-api-access-mwqft\") pod \"nova-cell0-d71f-account-create-update-qphgx\" (UID: \"e6e8c01e-0567-48bd-aaef-580afc5667af\") " pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.041611 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-eb75-account-create-update-9d68f"] Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.042813 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.045054 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.047710 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eb75-account-create-update-9d68f"] Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.102850 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-operator-scripts\") pod \"nova-cell1-eb75-account-create-update-9d68f\" (UID: \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\") " pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.103044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jfbj\" (UniqueName: \"kubernetes.io/projected/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-kube-api-access-8jfbj\") pod \"nova-cell1-eb75-account-create-update-9d68f\" (UID: \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\") " pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.142827 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.186066 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.207242 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-config\") pod \"dd57f8bd-d811-4740-b644-f8d69d329d5c\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.207643 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-ovndb-tls-certs\") pod \"dd57f8bd-d811-4740-b644-f8d69d329d5c\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.207793 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-combined-ca-bundle\") pod \"dd57f8bd-d811-4740-b644-f8d69d329d5c\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.207919 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-httpd-config\") pod \"dd57f8bd-d811-4740-b644-f8d69d329d5c\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.208082 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v5vb\" (UniqueName: \"kubernetes.io/projected/dd57f8bd-d811-4740-b644-f8d69d329d5c-kube-api-access-9v5vb\") pod \"dd57f8bd-d811-4740-b644-f8d69d329d5c\" (UID: \"dd57f8bd-d811-4740-b644-f8d69d329d5c\") " Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.208483 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8jfbj\" (UniqueName: \"kubernetes.io/projected/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-kube-api-access-8jfbj\") pod \"nova-cell1-eb75-account-create-update-9d68f\" (UID: \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\") " pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.208734 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-operator-scripts\") pod \"nova-cell1-eb75-account-create-update-9d68f\" (UID: \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\") " pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.209613 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-operator-scripts\") pod \"nova-cell1-eb75-account-create-update-9d68f\" (UID: \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\") " pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.217981 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.225724 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "dd57f8bd-d811-4740-b644-f8d69d329d5c" (UID: "dd57f8bd-d811-4740-b644-f8d69d329d5c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.230171 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd57f8bd-d811-4740-b644-f8d69d329d5c-kube-api-access-9v5vb" (OuterVolumeSpecName: "kube-api-access-9v5vb") pod "dd57f8bd-d811-4740-b644-f8d69d329d5c" (UID: "dd57f8bd-d811-4740-b644-f8d69d329d5c"). InnerVolumeSpecName "kube-api-access-9v5vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.242596 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jfbj\" (UniqueName: \"kubernetes.io/projected/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-kube-api-access-8jfbj\") pod \"nova-cell1-eb75-account-create-update-9d68f\" (UID: \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\") " pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.254833 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.272419 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-config" (OuterVolumeSpecName: "config") pod "dd57f8bd-d811-4740-b644-f8d69d329d5c" (UID: "dd57f8bd-d811-4740-b644-f8d69d329d5c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.286773 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd57f8bd-d811-4740-b644-f8d69d329d5c" (UID: "dd57f8bd-d811-4740-b644-f8d69d329d5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.311362 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.311389 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.311398 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v5vb\" (UniqueName: \"kubernetes.io/projected/dd57f8bd-d811-4740-b644-f8d69d329d5c-kube-api-access-9v5vb\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.311408 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.353153 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "dd57f8bd-d811-4740-b644-f8d69d329d5c" (UID: "dd57f8bd-d811-4740-b644-f8d69d329d5c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.366616 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.410861 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ssvdh"] Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.413829 4771 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd57f8bd-d811-4740-b644-f8d69d329d5c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.426978 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7hm42"] Feb 27 01:25:30 crc kubenswrapper[4771]: W0227 01:25:30.432482 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8681229_5d10_47f2_8cdf_fa8b6c584ef8.slice/crio-202046b8fe774b6d107983d0a8eb21e502423b743d746baf8082d1ac8863256d WatchSource:0}: Error finding container 202046b8fe774b6d107983d0a8eb21e502423b743d746baf8082d1ac8863256d: Status 404 returned error can't find the container with id 202046b8fe774b6d107983d0a8eb21e502423b743d746baf8082d1ac8863256d Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.648070 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ssvdh" event={"ID":"d8681229-5d10-47f2-8cdf-fa8b6c584ef8","Type":"ContainerStarted","Data":"202046b8fe774b6d107983d0a8eb21e502423b743d746baf8082d1ac8863256d"} Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.653837 4771 generic.go:334] "Generic (PLEG): container finished" podID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerID="5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379" exitCode=0 Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.653897 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc95dbbd4-gfl9m" event={"ID":"dd57f8bd-d811-4740-b644-f8d69d329d5c","Type":"ContainerDied","Data":"5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379"} Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.653919 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc95dbbd4-gfl9m" event={"ID":"dd57f8bd-d811-4740-b644-f8d69d329d5c","Type":"ContainerDied","Data":"5f418fc1a98d835b32c70fd3ba2b9ca68bc7ab8b6f499afce4a205a35286b35f"} Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.653935 4771 scope.go:117] "RemoveContainer" containerID="1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.654062 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fc95dbbd4-gfl9m" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.663421 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerStarted","Data":"3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98"} Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.666580 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7hm42" event={"ID":"ac233507-57ad-484c-817b-270cee86a50a","Type":"ContainerStarted","Data":"894d4e5cec8378a94d97b82598d4a0bbd1c7d048840d729e916ddb6de592ce84"} Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.705360 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fc95dbbd4-gfl9m"] Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.716651 4771 scope.go:117] "RemoveContainer" containerID="5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.723252 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5fc95dbbd4-gfl9m"] Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.763251 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a6fe-account-create-update-52b7q"] Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.792741 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kz4w4"] Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.796728 4771 scope.go:117] "RemoveContainer" containerID="1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57" Feb 27 01:25:30 crc kubenswrapper[4771]: E0227 01:25:30.797343 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57\": container with ID starting with 1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57 not found: ID does not exist" containerID="1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.797402 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57"} err="failed to get container status \"1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57\": rpc error: code = NotFound desc = could not find container \"1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57\": container with ID starting with 1deb49c54362ab7c5b82ffdcc324add53e5fc2ca49dfc8cd67dce9ea65914b57 not found: ID does not exist" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.797437 4771 scope.go:117] "RemoveContainer" containerID="5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379" Feb 27 01:25:30 crc kubenswrapper[4771]: E0227 01:25:30.798246 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379\": container with ID starting with 5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379 not found: ID does not exist" containerID="5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.798395 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379"} err="failed to get container status \"5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379\": rpc error: code = NotFound desc = could not find container \"5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379\": container with ID starting with 5207c640e61f0641a06714c955a373f99702b9fe4c3240013ee7eb4c13d85379 not found: ID does not exist" Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.897950 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eb75-account-create-update-9d68f"] Feb 27 01:25:30 crc kubenswrapper[4771]: W0227 01:25:30.907828 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c70c92b_3989_4fa8_9a8a_ce6be8839d8e.slice/crio-fcc2f756b29214d647ff7c2ac7c6103a1589b427f44b86cb7d66e673999b807d WatchSource:0}: Error finding container fcc2f756b29214d647ff7c2ac7c6103a1589b427f44b86cb7d66e673999b807d: Status 404 returned error can't find the container with id fcc2f756b29214d647ff7c2ac7c6103a1589b427f44b86cb7d66e673999b807d Feb 27 01:25:30 crc kubenswrapper[4771]: I0227 01:25:30.913833 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d71f-account-create-update-qphgx"] Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.675359 4771 generic.go:334] "Generic (PLEG): container finished" podID="d8681229-5d10-47f2-8cdf-fa8b6c584ef8" containerID="bcdff32e1ad8a2f8ddd816aa35365e0c1ab30ef0c1ab7ec36c45ddd67cde41fe" exitCode=0 Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.676141 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ssvdh" event={"ID":"d8681229-5d10-47f2-8cdf-fa8b6c584ef8","Type":"ContainerDied","Data":"bcdff32e1ad8a2f8ddd816aa35365e0c1ab30ef0c1ab7ec36c45ddd67cde41fe"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.677473 4771 generic.go:334] "Generic (PLEG): container finished" podID="1c70c92b-3989-4fa8-9a8a-ce6be8839d8e" containerID="ee68bff01ce8ca900c86f5a63b7a74e4a67b432bd36c1ab26062f6d21057fcee" exitCode=0 Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.677515 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eb75-account-create-update-9d68f" event={"ID":"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e","Type":"ContainerDied","Data":"ee68bff01ce8ca900c86f5a63b7a74e4a67b432bd36c1ab26062f6d21057fcee"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.677573 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eb75-account-create-update-9d68f" event={"ID":"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e","Type":"ContainerStarted","Data":"fcc2f756b29214d647ff7c2ac7c6103a1589b427f44b86cb7d66e673999b807d"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.680827 4771 generic.go:334] "Generic (PLEG): container finished" podID="e6e8c01e-0567-48bd-aaef-580afc5667af" containerID="de4c99802ee4fb5b938608f68ba2f8f6e8581da8ec45da90efbd05b14c2e2124" exitCode=0 Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.680877 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d71f-account-create-update-qphgx" event={"ID":"e6e8c01e-0567-48bd-aaef-580afc5667af","Type":"ContainerDied","Data":"de4c99802ee4fb5b938608f68ba2f8f6e8581da8ec45da90efbd05b14c2e2124"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.680898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-d71f-account-create-update-qphgx" event={"ID":"e6e8c01e-0567-48bd-aaef-580afc5667af","Type":"ContainerStarted","Data":"ab8eb214ae78bd9d2ac9e027e74031216708cedaf9947131203acb459e569ab4"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.682783 4771 generic.go:334] "Generic (PLEG): container finished" podID="ac233507-57ad-484c-817b-270cee86a50a" containerID="e46e028aae564b8b61bae21b82557178b6c6124429818922dfca92da85ac543f" exitCode=0 Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.682875 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7hm42" event={"ID":"ac233507-57ad-484c-817b-270cee86a50a","Type":"ContainerDied","Data":"e46e028aae564b8b61bae21b82557178b6c6124429818922dfca92da85ac543f"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.688517 4771 generic.go:334] "Generic (PLEG): container finished" podID="e18a7c26-064f-4b67-b9e2-d8a66499cec8" containerID="37c856aacb1089f98e87ed101087ce663fea108416ee0ee9716ff87847067073" exitCode=0 Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.688614 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kz4w4" event={"ID":"e18a7c26-064f-4b67-b9e2-d8a66499cec8","Type":"ContainerDied","Data":"37c856aacb1089f98e87ed101087ce663fea108416ee0ee9716ff87847067073"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.688641 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kz4w4" event={"ID":"e18a7c26-064f-4b67-b9e2-d8a66499cec8","Type":"ContainerStarted","Data":"8a466820a4434076343a0c44ccb8744d9c6cc1c6a0fc97ebceaa25d73920f5d4"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.691203 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7" containerID="bf2e80dfffdae487e743928065935ce3b663de7121e35856f3e40f18ab34213c" exitCode=0 Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.691244 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a6fe-account-create-update-52b7q" event={"ID":"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7","Type":"ContainerDied","Data":"bf2e80dfffdae487e743928065935ce3b663de7121e35856f3e40f18ab34213c"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.691264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a6fe-account-create-update-52b7q" event={"ID":"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7","Type":"ContainerStarted","Data":"4fb9b5ebd695d0b9ff5999adb23b6fb980a67b9f1ea7942e33f90eb2dd47400e"} Feb 27 01:25:31 crc kubenswrapper[4771]: I0227 01:25:31.783692 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" path="/var/lib/kubelet/pods/dd57f8bd-d811-4740-b644-f8d69d329d5c/volumes" Feb 27 01:25:32 crc kubenswrapper[4771]: I0227 01:25:32.702668 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerStarted","Data":"b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273"} Feb 27 01:25:32 crc kubenswrapper[4771]: I0227 01:25:32.703230 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="ceilometer-central-agent" containerID="cri-o://81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66" gracePeriod=30 Feb 27 01:25:32 crc kubenswrapper[4771]: I0227 01:25:32.703778 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="proxy-httpd" containerID="cri-o://b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273" gracePeriod=30 Feb 27 01:25:32 crc kubenswrapper[4771]: I0227 01:25:32.703839 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="sg-core" containerID="cri-o://3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98" gracePeriod=30 Feb 27 01:25:32 crc kubenswrapper[4771]: I0227 01:25:32.703881 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="ceilometer-notification-agent" containerID="cri-o://db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659" gracePeriod=30 Feb 27 01:25:32 crc kubenswrapper[4771]: I0227 01:25:32.738923 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.624566957 podStartE2EDuration="6.738904144s" podCreationTimestamp="2026-02-27 01:25:26 +0000 UTC" firstStartedPulling="2026-02-27 01:25:27.482470817 +0000 UTC m=+1240.420032105" lastFinishedPulling="2026-02-27 01:25:31.596808004 +0000 UTC m=+1244.534369292" observedRunningTime="2026-02-27 01:25:32.731351529 +0000 UTC m=+1245.668912827" watchObservedRunningTime="2026-02-27 01:25:32.738904144 +0000 UTC m=+1245.676465432" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.187433 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.277042 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-operator-scripts\") pod \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\" (UID: \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.277461 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5hkg\" (UniqueName: \"kubernetes.io/projected/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-kube-api-access-v5hkg\") pod \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\" (UID: \"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.277945 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7" (UID: "f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.283228 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-kube-api-access-v5hkg" (OuterVolumeSpecName: "kube-api-access-v5hkg") pod "f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7" (UID: "f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7"). InnerVolumeSpecName "kube-api-access-v5hkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.340387 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.350310 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.379348 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18a7c26-064f-4b67-b9e2-d8a66499cec8-operator-scripts\") pod \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\" (UID: \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.379504 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqkq7\" (UniqueName: \"kubernetes.io/projected/e18a7c26-064f-4b67-b9e2-d8a66499cec8-kube-api-access-qqkq7\") pod \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\" (UID: \"e18a7c26-064f-4b67-b9e2-d8a66499cec8\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.379541 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jfbj\" (UniqueName: \"kubernetes.io/projected/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-kube-api-access-8jfbj\") pod \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\" (UID: \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.379637 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-operator-scripts\") pod \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\" (UID: \"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.379986 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5hkg\" (UniqueName: \"kubernetes.io/projected/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-kube-api-access-v5hkg\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.379999 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.380015 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e18a7c26-064f-4b67-b9e2-d8a66499cec8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e18a7c26-064f-4b67-b9e2-d8a66499cec8" (UID: "e18a7c26-064f-4b67-b9e2-d8a66499cec8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.380326 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c70c92b-3989-4fa8-9a8a-ce6be8839d8e" (UID: "1c70c92b-3989-4fa8-9a8a-ce6be8839d8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.382799 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.382873 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-kube-api-access-8jfbj" (OuterVolumeSpecName: "kube-api-access-8jfbj") pod "1c70c92b-3989-4fa8-9a8a-ce6be8839d8e" (UID: "1c70c92b-3989-4fa8-9a8a-ce6be8839d8e"). InnerVolumeSpecName "kube-api-access-8jfbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.383020 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.384726 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18a7c26-064f-4b67-b9e2-d8a66499cec8-kube-api-access-qqkq7" (OuterVolumeSpecName: "kube-api-access-qqkq7") pod "e18a7c26-064f-4b67-b9e2-d8a66499cec8" (UID: "e18a7c26-064f-4b67-b9e2-d8a66499cec8"). InnerVolumeSpecName "kube-api-access-qqkq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.392233 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.480720 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwqft\" (UniqueName: \"kubernetes.io/projected/e6e8c01e-0567-48bd-aaef-580afc5667af-kube-api-access-mwqft\") pod \"e6e8c01e-0567-48bd-aaef-580afc5667af\" (UID: \"e6e8c01e-0567-48bd-aaef-580afc5667af\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.480830 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e8c01e-0567-48bd-aaef-580afc5667af-operator-scripts\") pod \"e6e8c01e-0567-48bd-aaef-580afc5667af\" (UID: \"e6e8c01e-0567-48bd-aaef-580afc5667af\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.480993 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-operator-scripts\") pod \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\" (UID: \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481038 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac233507-57ad-484c-817b-270cee86a50a-operator-scripts\") pod \"ac233507-57ad-484c-817b-270cee86a50a\" (UID: \"ac233507-57ad-484c-817b-270cee86a50a\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhrf9\" (UniqueName: \"kubernetes.io/projected/ac233507-57ad-484c-817b-270cee86a50a-kube-api-access-vhrf9\") pod \"ac233507-57ad-484c-817b-270cee86a50a\" (UID: \"ac233507-57ad-484c-817b-270cee86a50a\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mbt6\" (UniqueName: \"kubernetes.io/projected/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-kube-api-access-8mbt6\") pod \"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\" (UID: 
\"d8681229-5d10-47f2-8cdf-fa8b6c584ef8\") " Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481449 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6e8c01e-0567-48bd-aaef-580afc5667af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6e8c01e-0567-48bd-aaef-580afc5667af" (UID: "e6e8c01e-0567-48bd-aaef-580afc5667af"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481735 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6e8c01e-0567-48bd-aaef-580afc5667af-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481767 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqkq7\" (UniqueName: \"kubernetes.io/projected/e18a7c26-064f-4b67-b9e2-d8a66499cec8-kube-api-access-qqkq7\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481784 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jfbj\" (UniqueName: \"kubernetes.io/projected/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-kube-api-access-8jfbj\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481798 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481811 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18a7c26-064f-4b67-b9e2-d8a66499cec8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.481930 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac233507-57ad-484c-817b-270cee86a50a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac233507-57ad-484c-817b-270cee86a50a" (UID: "ac233507-57ad-484c-817b-270cee86a50a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.482322 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8681229-5d10-47f2-8cdf-fa8b6c584ef8" (UID: "d8681229-5d10-47f2-8cdf-fa8b6c584ef8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.483830 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e8c01e-0567-48bd-aaef-580afc5667af-kube-api-access-mwqft" (OuterVolumeSpecName: "kube-api-access-mwqft") pod "e6e8c01e-0567-48bd-aaef-580afc5667af" (UID: "e6e8c01e-0567-48bd-aaef-580afc5667af"). InnerVolumeSpecName "kube-api-access-mwqft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.484963 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac233507-57ad-484c-817b-270cee86a50a-kube-api-access-vhrf9" (OuterVolumeSpecName: "kube-api-access-vhrf9") pod "ac233507-57ad-484c-817b-270cee86a50a" (UID: "ac233507-57ad-484c-817b-270cee86a50a"). InnerVolumeSpecName "kube-api-access-vhrf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.485271 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-kube-api-access-8mbt6" (OuterVolumeSpecName: "kube-api-access-8mbt6") pod "d8681229-5d10-47f2-8cdf-fa8b6c584ef8" (UID: "d8681229-5d10-47f2-8cdf-fa8b6c584ef8"). InnerVolumeSpecName "kube-api-access-8mbt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.583940 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.583971 4771 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac233507-57ad-484c-817b-270cee86a50a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.583980 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhrf9\" (UniqueName: \"kubernetes.io/projected/ac233507-57ad-484c-817b-270cee86a50a-kube-api-access-vhrf9\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.583990 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mbt6\" (UniqueName: \"kubernetes.io/projected/d8681229-5d10-47f2-8cdf-fa8b6c584ef8-kube-api-access-8mbt6\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.583999 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwqft\" (UniqueName: \"kubernetes.io/projected/e6e8c01e-0567-48bd-aaef-580afc5667af-kube-api-access-mwqft\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.716697 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ssvdh" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.716702 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ssvdh" event={"ID":"d8681229-5d10-47f2-8cdf-fa8b6c584ef8","Type":"ContainerDied","Data":"202046b8fe774b6d107983d0a8eb21e502423b743d746baf8082d1ac8863256d"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.716753 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="202046b8fe774b6d107983d0a8eb21e502423b743d746baf8082d1ac8863256d" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.718658 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eb75-account-create-update-9d68f" event={"ID":"1c70c92b-3989-4fa8-9a8a-ce6be8839d8e","Type":"ContainerDied","Data":"fcc2f756b29214d647ff7c2ac7c6103a1589b427f44b86cb7d66e673999b807d"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.718682 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc2f756b29214d647ff7c2ac7c6103a1589b427f44b86cb7d66e673999b807d" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.718768 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eb75-account-create-update-9d68f" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.720573 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d71f-account-create-update-qphgx" event={"ID":"e6e8c01e-0567-48bd-aaef-580afc5667af","Type":"ContainerDied","Data":"ab8eb214ae78bd9d2ac9e027e74031216708cedaf9947131203acb459e569ab4"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.720597 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab8eb214ae78bd9d2ac9e027e74031216708cedaf9947131203acb459e569ab4" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.720580 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d71f-account-create-update-qphgx" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.726073 4771 generic.go:334] "Generic (PLEG): container finished" podID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerID="b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273" exitCode=0 Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.726129 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerDied","Data":"b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.726164 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerDied","Data":"3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.726133 4771 generic.go:334] "Generic (PLEG): container finished" podID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerID="3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98" exitCode=2 Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.726194 4771 generic.go:334] "Generic (PLEG): container finished" podID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerID="db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659" exitCode=0 Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.726272 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerDied","Data":"db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.728520 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kz4w4" event={"ID":"e18a7c26-064f-4b67-b9e2-d8a66499cec8","Type":"ContainerDied","Data":"8a466820a4434076343a0c44ccb8744d9c6cc1c6a0fc97ebceaa25d73920f5d4"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.728626 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a466820a4434076343a0c44ccb8744d9c6cc1c6a0fc97ebceaa25d73920f5d4" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.728585 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kz4w4" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.730791 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7hm42" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.730841 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7hm42" event={"ID":"ac233507-57ad-484c-817b-270cee86a50a","Type":"ContainerDied","Data":"894d4e5cec8378a94d97b82598d4a0bbd1c7d048840d729e916ddb6de592ce84"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.730867 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="894d4e5cec8378a94d97b82598d4a0bbd1c7d048840d729e916ddb6de592ce84" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.733186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a6fe-account-create-update-52b7q" event={"ID":"f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7","Type":"ContainerDied","Data":"4fb9b5ebd695d0b9ff5999adb23b6fb980a67b9f1ea7942e33f90eb2dd47400e"} Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.733212 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fb9b5ebd695d0b9ff5999adb23b6fb980a67b9f1ea7942e33f90eb2dd47400e" Feb 27 01:25:33 crc kubenswrapper[4771]: I0227 01:25:33.733252 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a6fe-account-create-update-52b7q" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.067082 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh9lc"] Feb 27 01:25:35 crc kubenswrapper[4771]: E0227 01:25:35.067864 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8681229-5d10-47f2-8cdf-fa8b6c584ef8" containerName="mariadb-database-create" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.067882 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8681229-5d10-47f2-8cdf-fa8b6c584ef8" containerName="mariadb-database-create" Feb 27 01:25:35 crc kubenswrapper[4771]: E0227 01:25:35.067900 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerName="neutron-api" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.067907 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerName="neutron-api" Feb 27 01:25:35 crc kubenswrapper[4771]: E0227 01:25:35.067920 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e8c01e-0567-48bd-aaef-580afc5667af" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.067928 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e8c01e-0567-48bd-aaef-580afc5667af" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: E0227 01:25:35.067947 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerName="neutron-httpd" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.067955 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerName="neutron-httpd" Feb 27 01:25:35 crc kubenswrapper[4771]: E0227 01:25:35.067980 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac233507-57ad-484c-817b-270cee86a50a" containerName="mariadb-database-create" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.067988 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac233507-57ad-484c-817b-270cee86a50a" containerName="mariadb-database-create" Feb 27 01:25:35 
crc kubenswrapper[4771]: E0227 01:25:35.068007 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068015 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: E0227 01:25:35.068026 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c70c92b-3989-4fa8-9a8a-ce6be8839d8e" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068033 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c70c92b-3989-4fa8-9a8a-ce6be8839d8e" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: E0227 01:25:35.068045 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18a7c26-064f-4b67-b9e2-d8a66499cec8" containerName="mariadb-database-create" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068053 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18a7c26-064f-4b67-b9e2-d8a66499cec8" containerName="mariadb-database-create" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068234 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac233507-57ad-484c-817b-270cee86a50a" containerName="mariadb-database-create" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068251 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8681229-5d10-47f2-8cdf-fa8b6c584ef8" containerName="mariadb-database-create" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068269 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18a7c26-064f-4b67-b9e2-d8a66499cec8" containerName="mariadb-database-create" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068282 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e8c01e-0567-48bd-aaef-580afc5667af" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068294 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerName="neutron-httpd" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068310 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd57f8bd-d811-4740-b644-f8d69d329d5c" containerName="neutron-api" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068323 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.068332 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c70c92b-3989-4fa8-9a8a-ce6be8839d8e" containerName="mariadb-account-create-update" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.069069 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.070789 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bkjj8" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.071244 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.074409 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.075849 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh9lc"] Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.110014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-scripts\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.110075 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmjm\" (UniqueName: \"kubernetes.io/projected/a927ae95-187b-4517-b54d-7faaf7de3155-kube-api-access-6kmjm\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.110104 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-config-data\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.110354 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.212036 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.212199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-scripts\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.212238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmjm\" (UniqueName: \"kubernetes.io/projected/a927ae95-187b-4517-b54d-7faaf7de3155-kube-api-access-6kmjm\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: 
\"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.212269 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-config-data\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.218694 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.231260 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-config-data\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.232189 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-scripts\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.240201 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmjm\" (UniqueName: \"kubernetes.io/projected/a927ae95-187b-4517-b54d-7faaf7de3155-kube-api-access-6kmjm\") pod \"nova-cell0-conductor-db-sync-zh9lc\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.384919 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:35 crc kubenswrapper[4771]: I0227 01:25:35.898199 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh9lc"] Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.188296 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.189618 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.223273 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.235940 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.531732 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.531971 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.562919 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.586302 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.770781 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" event={"ID":"a927ae95-187b-4517-b54d-7faaf7de3155","Type":"ContainerStarted","Data":"d7034b0d3e3441ee76ac5196f243b4f477e8409d5ba1f9304d1c02b39a51152d"} Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.770831 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.770848 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.770859 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:36 crc kubenswrapper[4771]: I0227 01:25:36.770871 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.555842 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.667895 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-config-data\") pod \"5dcddaa5-567b-4ee7-ba36-894719a998c9\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.668010 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-combined-ca-bundle\") pod \"5dcddaa5-567b-4ee7-ba36-894719a998c9\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.668176 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-log-httpd\") pod \"5dcddaa5-567b-4ee7-ba36-894719a998c9\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.668197 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-sg-core-conf-yaml\") pod \"5dcddaa5-567b-4ee7-ba36-894719a998c9\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.668339 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-run-httpd\") pod \"5dcddaa5-567b-4ee7-ba36-894719a998c9\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.668434 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8nmw\" (UniqueName: \"kubernetes.io/projected/5dcddaa5-567b-4ee7-ba36-894719a998c9-kube-api-access-n8nmw\") pod \"5dcddaa5-567b-4ee7-ba36-894719a998c9\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.668498 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-scripts\") pod \"5dcddaa5-567b-4ee7-ba36-894719a998c9\" (UID: \"5dcddaa5-567b-4ee7-ba36-894719a998c9\") " Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.668926 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5dcddaa5-567b-4ee7-ba36-894719a998c9" (UID: "5dcddaa5-567b-4ee7-ba36-894719a998c9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.669047 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5dcddaa5-567b-4ee7-ba36-894719a998c9" (UID: "5dcddaa5-567b-4ee7-ba36-894719a998c9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.669385 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.669401 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dcddaa5-567b-4ee7-ba36-894719a998c9-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.674630 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-scripts" (OuterVolumeSpecName: "scripts") pod "5dcddaa5-567b-4ee7-ba36-894719a998c9" (UID: "5dcddaa5-567b-4ee7-ba36-894719a998c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.676747 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dcddaa5-567b-4ee7-ba36-894719a998c9-kube-api-access-n8nmw" (OuterVolumeSpecName: "kube-api-access-n8nmw") pod "5dcddaa5-567b-4ee7-ba36-894719a998c9" (UID: "5dcddaa5-567b-4ee7-ba36-894719a998c9"). InnerVolumeSpecName "kube-api-access-n8nmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.700874 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5dcddaa5-567b-4ee7-ba36-894719a998c9" (UID: "5dcddaa5-567b-4ee7-ba36-894719a998c9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.757700 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dcddaa5-567b-4ee7-ba36-894719a998c9" (UID: "5dcddaa5-567b-4ee7-ba36-894719a998c9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.771787 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8nmw\" (UniqueName: \"kubernetes.io/projected/5dcddaa5-567b-4ee7-ba36-894719a998c9-kube-api-access-n8nmw\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.771823 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.771832 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.771841 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.782843 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-config-data" (OuterVolumeSpecName: "config-data") pod "5dcddaa5-567b-4ee7-ba36-894719a998c9" (UID: "5dcddaa5-567b-4ee7-ba36-894719a998c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.790404 4771 generic.go:334] "Generic (PLEG): container finished" podID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerID="81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66" exitCode=0 Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.791482 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.795633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerDied","Data":"81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66"} Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.795685 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dcddaa5-567b-4ee7-ba36-894719a998c9","Type":"ContainerDied","Data":"71cf574ac7e9bbe5ce6b9dc88520671ad9b18769f1d45e6aeb12ab23cb24800d"} Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.795712 4771 scope.go:117] "RemoveContainer" containerID="b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.829330 4771 scope.go:117] "RemoveContainer" containerID="3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.847191 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.857913 4771 scope.go:117] "RemoveContainer" containerID="db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.867716 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.873954 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcddaa5-567b-4ee7-ba36-894719a998c9-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.881764 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:37 crc kubenswrapper[4771]: E0227 01:25:37.882114 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="ceilometer-notification-agent" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.882128 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="ceilometer-notification-agent" Feb 27 01:25:37 crc kubenswrapper[4771]: E0227 01:25:37.882148 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="sg-core" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.882154 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="sg-core" Feb 27 01:25:37 crc kubenswrapper[4771]: E0227 01:25:37.882169 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="ceilometer-central-agent" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.882178 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="ceilometer-central-agent" Feb 27 01:25:37 crc kubenswrapper[4771]: E0227 01:25:37.882202 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="proxy-httpd" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.882210 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="proxy-httpd" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 
01:25:37.882359 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="ceilometer-notification-agent" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.882368 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="proxy-httpd" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.882390 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="sg-core" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.882400 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" containerName="ceilometer-central-agent" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.884730 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.885460 4771 scope.go:117] "RemoveContainer" containerID="81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.889271 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.889320 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.910811 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.944272 4771 scope.go:117] "RemoveContainer" containerID="b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273" Feb 27 01:25:37 crc kubenswrapper[4771]: E0227 01:25:37.944737 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273\": container with ID starting with b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273 not found: ID does not exist" containerID="b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.944770 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273"} err="failed to get container status \"b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273\": rpc error: code = NotFound desc = could not find container \"b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273\": container with ID starting with b463a562d021b80d8ed165aab336d5cf5d4a7e62b5b2c4f8b3db3ccdfe6b7273 not found: ID does not exist" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.944790 4771 scope.go:117] "RemoveContainer" containerID="3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98" Feb 27 01:25:37 crc kubenswrapper[4771]: E0227 01:25:37.944983 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98\": container with ID starting with 3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98 not found: ID does not exist" containerID="3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.945006 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98"} err="failed to get container status \"3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98\": rpc error: code = NotFound desc = could not find container \"3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98\": container with ID starting with 3a2fa7eafc391f9ba968f4b7b4707f332fc08175480ee1c53f36f82a4cda7d98 not found: ID does not exist" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.945023 4771 scope.go:117] "RemoveContainer" containerID="db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659" Feb 27 01:25:37 crc kubenswrapper[4771]: E0227 01:25:37.945406 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659\": container with ID starting with db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659 not found: ID does not exist" containerID="db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.945451 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659"} err="failed to get container status \"db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659\": rpc error: code = NotFound desc = could not find container \"db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659\": container with ID starting with db95962d0596b9c9ce82875974fec1606d36f60fbc0524bd6d078544d86cf659 not found: ID does not exist" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.945478 4771 scope.go:117] "RemoveContainer" containerID="81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66" Feb 27 01:25:37 crc kubenswrapper[4771]: E0227 01:25:37.946192 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66\": container with ID starting with 81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66 not found: ID does not exist" containerID="81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.946240 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66"} err="failed to get container status \"81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66\": rpc error: code = NotFound desc = could not find container \"81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66\": container with ID starting with 81ba5d615a0683d5ca9af691dd3103c55a2955078e8f7e8852bd811f0b0eee66 not found: ID does not exist" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.976019 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-log-httpd\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.976270 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-scripts\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.976416 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5dk\" (UniqueName: \"kubernetes.io/projected/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-kube-api-access-tg5dk\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.976485 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.976679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.976809 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-run-httpd\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:37 crc kubenswrapper[4771]: I0227 01:25:37.976924 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-config-data\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.078331 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.078393 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-run-httpd\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.078448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-config-data\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.078503 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-log-httpd\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.078904 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-run-httpd\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.078960 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-log-httpd\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.079019 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-scripts\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.079222 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg5dk\" (UniqueName: \"kubernetes.io/projected/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-kube-api-access-tg5dk\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.079254 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.082696 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.082932 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-config-data\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.084463 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.085133 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-scripts\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.100273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg5dk\" (UniqueName: \"kubernetes.io/projected/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-kube-api-access-tg5dk\") pod \"ceilometer-0\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.219684 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.697270 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.703407 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.705273 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.814263 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerStarted","Data":"a7eea7526225e076a5c18809c35b01c21b2cc2c51c10edc66d1da97ec6ea063f"} Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.816241 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.816258 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.823040 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 01:25:38 crc kubenswrapper[4771]: I0227 01:25:38.835204 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 27 01:25:39 crc kubenswrapper[4771]: I0227 01:25:39.786441 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dcddaa5-567b-4ee7-ba36-894719a998c9" path="/var/lib/kubelet/pods/5dcddaa5-567b-4ee7-ba36-894719a998c9/volumes" Feb 27 01:25:43 crc kubenswrapper[4771]: I0227 01:25:43.860422 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerStarted","Data":"7e822e58f897ad1c063cd2eb974dab1a348cec564e18b5355b8fbe75c6974f32"} Feb 27 01:25:44 crc kubenswrapper[4771]: I0227 01:25:44.871263 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" event={"ID":"a927ae95-187b-4517-b54d-7faaf7de3155","Type":"ContainerStarted","Data":"4f25ff54fef33988e38ce0a2441a74ef999ed718b9f05e65087731a7701707d7"} Feb 27 01:25:44 crc kubenswrapper[4771]: I0227 01:25:44.873294 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerStarted","Data":"b560253dd71ede6f53d6b143f3f797d0e680c5ffed4813739e619d7c090d0c2f"} Feb 27 01:25:44 crc kubenswrapper[4771]: I0227 01:25:44.892622 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" podStartSLOduration=1.640129986 podStartE2EDuration="9.892597297s" podCreationTimestamp="2026-02-27 01:25:35 +0000 UTC" firstStartedPulling="2026-02-27 01:25:35.906282141 +0000 UTC m=+1248.843843429" lastFinishedPulling="2026-02-27 01:25:44.158749452 +0000 UTC m=+1257.096310740" observedRunningTime="2026-02-27 01:25:44.883682065 +0000 UTC m=+1257.821243353" watchObservedRunningTime="2026-02-27 01:25:44.892597297 +0000 UTC m=+1257.830158585" Feb 27 01:25:45 crc kubenswrapper[4771]: I0227 01:25:45.682477 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:45 crc kubenswrapper[4771]: I0227 01:25:45.908652 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerStarted","Data":"991bdaad5effef40fdf16508bce0cd88d3c827528700da35fdb75486086bb6d0"} Feb 27 01:25:48 crc kubenswrapper[4771]: I0227 01:25:48.947804 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerStarted","Data":"2cc084be9dbea22a5f8d34fe8a5e06e6f3f3ecfef8f6c0433494c0eed137041b"} Feb 27 01:25:48 crc kubenswrapper[4771]: I0227 01:25:48.948326 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="proxy-httpd" containerID="cri-o://2cc084be9dbea22a5f8d34fe8a5e06e6f3f3ecfef8f6c0433494c0eed137041b" gracePeriod=30 Feb 27 01:25:48 crc kubenswrapper[4771]: I0227 01:25:48.948344 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 01:25:48 crc kubenswrapper[4771]: I0227 01:25:48.947981 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="ceilometer-central-agent" containerID="cri-o://7e822e58f897ad1c063cd2eb974dab1a348cec564e18b5355b8fbe75c6974f32" gracePeriod=30 Feb 27 01:25:48 crc kubenswrapper[4771]: I0227 01:25:48.948455 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="sg-core" containerID="cri-o://991bdaad5effef40fdf16508bce0cd88d3c827528700da35fdb75486086bb6d0" gracePeriod=30 Feb 27 01:25:48 crc kubenswrapper[4771]: I0227 01:25:48.948407 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="ceilometer-notification-agent" containerID="cri-o://b560253dd71ede6f53d6b143f3f797d0e680c5ffed4813739e619d7c090d0c2f" gracePeriod=30 Feb 27 01:25:48 crc kubenswrapper[4771]: I0227 01:25:48.984678 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.00816706 podStartE2EDuration="11.98465513s" podCreationTimestamp="2026-02-27 01:25:37 +0000 UTC" firstStartedPulling="2026-02-27 01:25:38.730216263 +0000 UTC m=+1251.667777551" lastFinishedPulling="2026-02-27 01:25:47.706704333 +0000 UTC m=+1260.644265621" observedRunningTime="2026-02-27 01:25:48.97397975 +0000 UTC m=+1261.911541038" watchObservedRunningTime="2026-02-27 01:25:48.98465513 +0000 UTC m=+1261.922216418" Feb 27 01:25:49 crc kubenswrapper[4771]: I0227 01:25:49.961215 4771 generic.go:334] "Generic (PLEG): container finished" podID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerID="2cc084be9dbea22a5f8d34fe8a5e06e6f3f3ecfef8f6c0433494c0eed137041b" exitCode=0 Feb 27 01:25:49 crc kubenswrapper[4771]: I0227 01:25:49.961669 4771 generic.go:334] "Generic (PLEG): container finished" podID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerID="991bdaad5effef40fdf16508bce0cd88d3c827528700da35fdb75486086bb6d0" exitCode=2 Feb 27 01:25:49 crc kubenswrapper[4771]: I0227 01:25:49.961693 4771 generic.go:334] "Generic (PLEG): container finished" podID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerID="b560253dd71ede6f53d6b143f3f797d0e680c5ffed4813739e619d7c090d0c2f" exitCode=0 Feb 27 01:25:49 crc kubenswrapper[4771]: I0227 01:25:49.961409 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerDied","Data":"2cc084be9dbea22a5f8d34fe8a5e06e6f3f3ecfef8f6c0433494c0eed137041b"} Feb 27 01:25:49 crc kubenswrapper[4771]: I0227 01:25:49.961762 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerDied","Data":"991bdaad5effef40fdf16508bce0cd88d3c827528700da35fdb75486086bb6d0"} Feb 27 01:25:49 crc kubenswrapper[4771]: I0227 01:25:49.961803 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerDied","Data":"b560253dd71ede6f53d6b143f3f797d0e680c5ffed4813739e619d7c090d0c2f"} Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.006975 4771 generic.go:334] "Generic (PLEG): container finished" podID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerID="7e822e58f897ad1c063cd2eb974dab1a348cec564e18b5355b8fbe75c6974f32" exitCode=0 Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.007067 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerDied","Data":"7e822e58f897ad1c063cd2eb974dab1a348cec564e18b5355b8fbe75c6974f32"} Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.195084 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285093 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-log-httpd\") pod \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285153 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-sg-core-conf-yaml\") pod \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285308 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-run-httpd\") pod \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285388 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-config-data\") pod \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285434 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg5dk\" (UniqueName: \"kubernetes.io/projected/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-kube-api-access-tg5dk\") pod \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285467 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-scripts\") pod 
\"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285485 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-combined-ca-bundle\") pod \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\" (UID: \"c294d7a9-b983-45df-82dd-1b5bc8ed7a06\") " Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285595 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c294d7a9-b983-45df-82dd-1b5bc8ed7a06" (UID: "c294d7a9-b983-45df-82dd-1b5bc8ed7a06"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.285651 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c294d7a9-b983-45df-82dd-1b5bc8ed7a06" (UID: "c294d7a9-b983-45df-82dd-1b5bc8ed7a06"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.286403 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.286421 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.291324 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-scripts" (OuterVolumeSpecName: "scripts") pod "c294d7a9-b983-45df-82dd-1b5bc8ed7a06" (UID: "c294d7a9-b983-45df-82dd-1b5bc8ed7a06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.291879 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-kube-api-access-tg5dk" (OuterVolumeSpecName: "kube-api-access-tg5dk") pod "c294d7a9-b983-45df-82dd-1b5bc8ed7a06" (UID: "c294d7a9-b983-45df-82dd-1b5bc8ed7a06"). InnerVolumeSpecName "kube-api-access-tg5dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.313097 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c294d7a9-b983-45df-82dd-1b5bc8ed7a06" (UID: "c294d7a9-b983-45df-82dd-1b5bc8ed7a06"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.359531 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c294d7a9-b983-45df-82dd-1b5bc8ed7a06" (UID: "c294d7a9-b983-45df-82dd-1b5bc8ed7a06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.381768 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-config-data" (OuterVolumeSpecName: "config-data") pod "c294d7a9-b983-45df-82dd-1b5bc8ed7a06" (UID: "c294d7a9-b983-45df-82dd-1b5bc8ed7a06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.388136 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.388173 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg5dk\" (UniqueName: \"kubernetes.io/projected/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-kube-api-access-tg5dk\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.388185 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.388194 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:54 crc kubenswrapper[4771]: I0227 01:25:54.388206 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c294d7a9-b983-45df-82dd-1b5bc8ed7a06-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.021047 4771 generic.go:334] "Generic (PLEG): container finished" podID="a927ae95-187b-4517-b54d-7faaf7de3155" containerID="4f25ff54fef33988e38ce0a2441a74ef999ed718b9f05e65087731a7701707d7" exitCode=0 Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.021096 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" event={"ID":"a927ae95-187b-4517-b54d-7faaf7de3155","Type":"ContainerDied","Data":"4f25ff54fef33988e38ce0a2441a74ef999ed718b9f05e65087731a7701707d7"} Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.026598 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c294d7a9-b983-45df-82dd-1b5bc8ed7a06","Type":"ContainerDied","Data":"a7eea7526225e076a5c18809c35b01c21b2cc2c51c10edc66d1da97ec6ea063f"} Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.026642 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.026659 4771 scope.go:117] "RemoveContainer" containerID="2cc084be9dbea22a5f8d34fe8a5e06e6f3f3ecfef8f6c0433494c0eed137041b" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.077206 4771 scope.go:117] "RemoveContainer" containerID="991bdaad5effef40fdf16508bce0cd88d3c827528700da35fdb75486086bb6d0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.078863 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.091979 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113144 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:55 crc kubenswrapper[4771]: E0227 01:25:55.113538 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="ceilometer-central-agent" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113566 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="ceilometer-central-agent" Feb 27 01:25:55 crc kubenswrapper[4771]: E0227 01:25:55.113581 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="ceilometer-notification-agent" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113588 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="ceilometer-notification-agent" Feb 27 01:25:55 crc kubenswrapper[4771]: E0227 01:25:55.113600 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="sg-core" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113606 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="sg-core" Feb 27 01:25:55 crc kubenswrapper[4771]: E0227 01:25:55.113635 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="proxy-httpd" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113641 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="proxy-httpd" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113799 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="sg-core" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113815 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="ceilometer-central-agent" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113825 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="proxy-httpd" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.113837 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" containerName="ceilometer-notification-agent" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.115364 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.116482 4771 scope.go:117] "RemoveContainer" containerID="b560253dd71ede6f53d6b143f3f797d0e680c5ffed4813739e619d7c090d0c2f" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.121761 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.121970 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.126138 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.172037 4771 scope.go:117] "RemoveContainer" containerID="7e822e58f897ad1c063cd2eb974dab1a348cec564e18b5355b8fbe75c6974f32" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.190502 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="27da755f-7147-4fee-af32-994932f0b715" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.175:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.201377 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwhhk\" (UniqueName: \"kubernetes.io/projected/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-kube-api-access-hwhhk\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.201437 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-run-httpd\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.201605 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.201679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-log-httpd\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.201774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-config-data\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.201936 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-scripts\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.202020 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.303550 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwhhk\" (UniqueName: \"kubernetes.io/projected/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-kube-api-access-hwhhk\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.303829 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-run-httpd\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.304021 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.304157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-log-httpd\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.304261 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-config-data\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.304337 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-run-httpd\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.304454 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-scripts\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.304564 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.304847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-log-httpd\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.308932 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-scripts\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.309762 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-config-data\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.310574 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.321680 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.329594 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwhhk\" (UniqueName: \"kubernetes.io/projected/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-kube-api-access-hwhhk\") pod \"ceilometer-0\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.473714 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.783287 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c294d7a9-b983-45df-82dd-1b5bc8ed7a06" path="/var/lib/kubelet/pods/c294d7a9-b983-45df-82dd-1b5bc8ed7a06/volumes" Feb 27 01:25:55 crc kubenswrapper[4771]: I0227 01:25:55.930988 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.036504 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerStarted","Data":"e928481d205fb5cf066fe84dd4e2761d6c49dd0896f580db163c83a6e2e82e2c"} Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.480148 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.551280 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.731877 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-combined-ca-bundle\") pod \"a927ae95-187b-4517-b54d-7faaf7de3155\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.732599 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kmjm\" (UniqueName: \"kubernetes.io/projected/a927ae95-187b-4517-b54d-7faaf7de3155-kube-api-access-6kmjm\") pod \"a927ae95-187b-4517-b54d-7faaf7de3155\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.732660 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-config-data\") pod \"a927ae95-187b-4517-b54d-7faaf7de3155\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.732725 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-scripts\") pod \"a927ae95-187b-4517-b54d-7faaf7de3155\" (UID: \"a927ae95-187b-4517-b54d-7faaf7de3155\") " Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.740856 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-scripts" (OuterVolumeSpecName: "scripts") pod "a927ae95-187b-4517-b54d-7faaf7de3155" (UID: "a927ae95-187b-4517-b54d-7faaf7de3155"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.741168 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a927ae95-187b-4517-b54d-7faaf7de3155-kube-api-access-6kmjm" (OuterVolumeSpecName: "kube-api-access-6kmjm") pod "a927ae95-187b-4517-b54d-7faaf7de3155" (UID: "a927ae95-187b-4517-b54d-7faaf7de3155"). InnerVolumeSpecName "kube-api-access-6kmjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.766676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a927ae95-187b-4517-b54d-7faaf7de3155" (UID: "a927ae95-187b-4517-b54d-7faaf7de3155"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.774802 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-config-data" (OuterVolumeSpecName: "config-data") pod "a927ae95-187b-4517-b54d-7faaf7de3155" (UID: "a927ae95-187b-4517-b54d-7faaf7de3155"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.835111 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.835156 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kmjm\" (UniqueName: \"kubernetes.io/projected/a927ae95-187b-4517-b54d-7faaf7de3155-kube-api-access-6kmjm\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.835169 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:56 crc kubenswrapper[4771]: I0227 01:25:56.835177 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a927ae95-187b-4517-b54d-7faaf7de3155-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.048531 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerStarted","Data":"5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac"} Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.058430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" event={"ID":"a927ae95-187b-4517-b54d-7faaf7de3155","Type":"ContainerDied","Data":"d7034b0d3e3441ee76ac5196f243b4f477e8409d5ba1f9304d1c02b39a51152d"} Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.058482 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7034b0d3e3441ee76ac5196f243b4f477e8409d5ba1f9304d1c02b39a51152d" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.058611 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zh9lc" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.142817 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 01:25:57 crc kubenswrapper[4771]: E0227 01:25:57.143385 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a927ae95-187b-4517-b54d-7faaf7de3155" containerName="nova-cell0-conductor-db-sync" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.143411 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a927ae95-187b-4517-b54d-7faaf7de3155" containerName="nova-cell0-conductor-db-sync" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.143654 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a927ae95-187b-4517-b54d-7faaf7de3155" containerName="nova-cell0-conductor-db-sync" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.144313 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.147811 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.147975 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bkjj8" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.152834 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.243414 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcmb7\" (UniqueName: \"kubernetes.io/projected/03b297ed-ac7f-4416-b929-b3d463bc5d72-kube-api-access-rcmb7\") pod \"nova-cell0-conductor-0\" (UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.243882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b297ed-ac7f-4416-b929-b3d463bc5d72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.243933 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b297ed-ac7f-4416-b929-b3d463bc5d72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.346228 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b297ed-ac7f-4416-b929-b3d463bc5d72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.346311 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b297ed-ac7f-4416-b929-b3d463bc5d72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.346458 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcmb7\" (UniqueName: \"kubernetes.io/projected/03b297ed-ac7f-4416-b929-b3d463bc5d72-kube-api-access-rcmb7\") pod \"nova-cell0-conductor-0\" (UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.352607 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b297ed-ac7f-4416-b929-b3d463bc5d72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.353179 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b297ed-ac7f-4416-b929-b3d463bc5d72-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.363175 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcmb7\" (UniqueName: \"kubernetes.io/projected/03b297ed-ac7f-4416-b929-b3d463bc5d72-kube-api-access-rcmb7\") pod \"nova-cell0-conductor-0\" (UID: \"03b297ed-ac7f-4416-b929-b3d463bc5d72\") " pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:57 crc kubenswrapper[4771]: I0227 01:25:57.545115 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:58 crc kubenswrapper[4771]: I0227 01:25:58.007910 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 27 01:25:58 crc kubenswrapper[4771]: I0227 01:25:58.078527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerStarted","Data":"90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb"} Feb 27 01:25:58 crc kubenswrapper[4771]: I0227 01:25:58.078596 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerStarted","Data":"4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a"} Feb 27 01:25:58 crc kubenswrapper[4771]: I0227 01:25:58.084133 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"03b297ed-ac7f-4416-b929-b3d463bc5d72","Type":"ContainerStarted","Data":"ca7901f885c542cd27e5b2be246269929401945a537e273324d4a0fa204c1ff8"} Feb 27 01:25:59 crc kubenswrapper[4771]: I0227 01:25:59.092779 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"03b297ed-ac7f-4416-b929-b3d463bc5d72","Type":"ContainerStarted","Data":"d6e67de5cd247b9e69993ca28740c65af49d08a06e282d10d1aa095eb65d28db"} Feb 27 01:25:59 crc kubenswrapper[4771]: I0227 01:25:59.093788 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 27 01:25:59 crc kubenswrapper[4771]: I0227 01:25:59.120431 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.120414315 podStartE2EDuration="2.120414315s" podCreationTimestamp="2026-02-27 01:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:25:59.113373984 +0000 UTC m=+1272.050935302" watchObservedRunningTime="2026-02-27 01:25:59.120414315 +0000 UTC m=+1272.057975593" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.144835 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535926-bsrhj"] Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.146261 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-bsrhj" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.152848 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535926-bsrhj"] Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.152989 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.153026 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.155841 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.306392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxllc\" (UniqueName: \"kubernetes.io/projected/abf6ea82-e9fc-4fd6-81c5-6883afb00b83-kube-api-access-gxllc\") pod \"auto-csr-approver-29535926-bsrhj\" (UID: \"abf6ea82-e9fc-4fd6-81c5-6883afb00b83\") " pod="openshift-infra/auto-csr-approver-29535926-bsrhj" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.408472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxllc\" (UniqueName: \"kubernetes.io/projected/abf6ea82-e9fc-4fd6-81c5-6883afb00b83-kube-api-access-gxllc\") pod \"auto-csr-approver-29535926-bsrhj\" (UID: \"abf6ea82-e9fc-4fd6-81c5-6883afb00b83\") " pod="openshift-infra/auto-csr-approver-29535926-bsrhj" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.446907 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxllc\" (UniqueName: \"kubernetes.io/projected/abf6ea82-e9fc-4fd6-81c5-6883afb00b83-kube-api-access-gxllc\") pod \"auto-csr-approver-29535926-bsrhj\" (UID: \"abf6ea82-e9fc-4fd6-81c5-6883afb00b83\") " pod="openshift-infra/auto-csr-approver-29535926-bsrhj" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.464608 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-bsrhj" Feb 27 01:26:00 crc kubenswrapper[4771]: I0227 01:26:00.938674 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535926-bsrhj"] Feb 27 01:26:00 crc kubenswrapper[4771]: W0227 01:26:00.940192 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabf6ea82_e9fc_4fd6_81c5_6883afb00b83.slice/crio-ea76fb71075d9f3caec10d3da72057c66050e08573813a0683754bebf567e677 WatchSource:0}: Error finding container ea76fb71075d9f3caec10d3da72057c66050e08573813a0683754bebf567e677: Status 404 returned error can't find the container with id ea76fb71075d9f3caec10d3da72057c66050e08573813a0683754bebf567e677 Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.118963 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerStarted","Data":"1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0"} Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.119112 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="ceilometer-central-agent" containerID="cri-o://5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac" gracePeriod=30 Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.119407 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.119651 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="sg-core" containerID="cri-o://90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb" gracePeriod=30 Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.119662 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="ceilometer-notification-agent" containerID="cri-o://4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a" gracePeriod=30 Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.119698 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="proxy-httpd" containerID="cri-o://1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0" gracePeriod=30 Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.121706 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-bsrhj" event={"ID":"abf6ea82-e9fc-4fd6-81c5-6883afb00b83","Type":"ContainerStarted","Data":"ea76fb71075d9f3caec10d3da72057c66050e08573813a0683754bebf567e677"} Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.149847 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4361221149999999 podStartE2EDuration="6.149826005s" podCreationTimestamp="2026-02-27 01:25:55 +0000 UTC" firstStartedPulling="2026-02-27 01:25:55.931872994 +0000 UTC m=+1268.869434292" lastFinishedPulling="2026-02-27 01:26:00.645576854 +0000 UTC m=+1273.583138182" observedRunningTime="2026-02-27 01:26:01.144910342 +0000 UTC m=+1274.082471630" watchObservedRunningTime="2026-02-27 
Feb 27 01:26:01 crc kubenswrapper[4771]: I0227 01:26:01.149847 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4361221149999999 podStartE2EDuration="6.149826005s" podCreationTimestamp="2026-02-27 01:25:55 +0000 UTC" firstStartedPulling="2026-02-27 01:25:55.931872994 +0000 UTC m=+1268.869434292" lastFinishedPulling="2026-02-27 01:26:00.645576854 +0000 UTC m=+1273.583138182" observedRunningTime="2026-02-27 01:26:01.144910342 +0000 UTC m=+1274.082471630" watchObservedRunningTime="2026-02-27 01:26:01.149826005 +0000 UTC m=+1274.087387303"
Feb 27 01:26:02 crc kubenswrapper[4771]: I0227 01:26:02.135729 4771 generic.go:334] "Generic (PLEG): container finished" podID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerID="1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0" exitCode=0
Feb 27 01:26:02 crc kubenswrapper[4771]: I0227 01:26:02.136007 4771 generic.go:334] "Generic (PLEG): container finished" podID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerID="90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb" exitCode=2
Feb 27 01:26:02 crc kubenswrapper[4771]: I0227 01:26:02.136015 4771 generic.go:334] "Generic (PLEG): container finished" podID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerID="4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a" exitCode=0
Feb 27 01:26:02 crc kubenswrapper[4771]: I0227 01:26:02.135802 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerDied","Data":"1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0"}
Feb 27 01:26:02 crc kubenswrapper[4771]: I0227 01:26:02.136053 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerDied","Data":"90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb"}
Feb 27 01:26:02 crc kubenswrapper[4771]: I0227 01:26:02.136067 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerDied","Data":"4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a"}
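This second grace-period kill repeats the pattern of the first teardown at 01:25:49: within about a second of the kill requests, proxy-httpd and both ceilometer agents finish with exitCode=0, while sg-core finishes with exitCode=2, which is consistent with a process that exits non-zero on SIGTERM rather than one that crashed (an interpretation, not something the log states). A sketch joining the kill requests with the exit codes later reported for the same container IDs, assuming excerpts like this one on stdin:

```python
import re
import sys

KILL = re.compile(r'"Killing container with a grace period".*'
                  r'containerName="([^"]+)" containerID="cri-o://([0-9a-f]+)"')
DONE = re.compile(r'"Generic \(PLEG\): container finished".*'
                  r'containerID="([0-9a-f]+)" exitCode=(-?\d+)')

kills, codes = {}, {}
for line in sys.stdin:
    if (k := KILL.search(line)):
        kills[k[2]] = k[1]          # container ID -> container name
    elif (d := DONE.search(line)):
        codes[d[1]] = int(d[2])     # container ID -> exit code

for cid, name in kills.items():
    code = codes.get(cid)
    print(f"{name}: exitCode={code}" if code is not None else f"{name}: no exit seen")
```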
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.056336 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-log-httpd\") pod \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.056736 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-combined-ca-bundle\") pod \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.056772 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-sg-core-conf-yaml\") pod \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.056806 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-scripts\") pod \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.056822 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-config-data\") pod \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.056905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-run-httpd\") pod \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.056943 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwhhk\" (UniqueName: \"kubernetes.io/projected/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-kube-api-access-hwhhk\") pod \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\" (UID: \"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49\") " Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.057053 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" (UID: "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.057248 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" (UID: "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.057812 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.057842 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.063043 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-kube-api-access-hwhhk" (OuterVolumeSpecName: "kube-api-access-hwhhk") pod "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" (UID: "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49"). InnerVolumeSpecName "kube-api-access-hwhhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.065424 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-scripts" (OuterVolumeSpecName: "scripts") pod "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" (UID: "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.086140 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" (UID: "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.146384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" (UID: "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.147659 4771 generic.go:334] "Generic (PLEG): container finished" podID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerID="5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac" exitCode=0 Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.147726 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerDied","Data":"5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac"} Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.147750 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.147801 4771 scope.go:117] "RemoveContainer" containerID="1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.147755 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1be945e5-35ac-4f9c-a844-c8c4b7a4fd49","Type":"ContainerDied","Data":"e928481d205fb5cf066fe84dd4e2761d6c49dd0896f580db163c83a6e2e82e2c"} Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.149909 4771 generic.go:334] "Generic (PLEG): container finished" podID="abf6ea82-e9fc-4fd6-81c5-6883afb00b83" containerID="596827d833e8740f4219de319278bedbde3129517faaddc09167ab70da358220" exitCode=0 Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.149945 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-bsrhj" event={"ID":"abf6ea82-e9fc-4fd6-81c5-6883afb00b83","Type":"ContainerDied","Data":"596827d833e8740f4219de319278bedbde3129517faaddc09167ab70da358220"} Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.161715 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.161752 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.161765 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.161777 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwhhk\" (UniqueName: \"kubernetes.io/projected/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-kube-api-access-hwhhk\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.166398 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-config-data" (OuterVolumeSpecName: "config-data") pod "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" (UID: "1be945e5-35ac-4f9c-a844-c8c4b7a4fd49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.172868 4771 scope.go:117] "RemoveContainer" containerID="90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.190696 4771 scope.go:117] "RemoveContainer" containerID="4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.210823 4771 scope.go:117] "RemoveContainer" containerID="5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.226888 4771 scope.go:117] "RemoveContainer" containerID="1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0" Feb 27 01:26:03 crc kubenswrapper[4771]: E0227 01:26:03.227365 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0\": container with ID starting with 1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0 not found: ID does not exist" containerID="1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.227396 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0"} err="failed to get container status \"1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0\": rpc error: code = NotFound desc = could not find container \"1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0\": container with ID starting with 1b29e9eb29bf56ab95201794b9d956cbcbe2f5263e08dfd2152ca8bec5b24af0 not found: ID does not exist" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.227415 4771 scope.go:117] "RemoveContainer" containerID="90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb" Feb 27 01:26:03 crc kubenswrapper[4771]: E0227 01:26:03.227951 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb\": container with ID starting with 90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb not found: ID does not exist" containerID="90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.228028 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb"} err="failed to get container status \"90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb\": rpc error: code = NotFound desc = could not find container \"90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb\": container with ID starting with 90159afbb3091f0b31c7ca9cb62367c86d128f542798aaac600f1031920078cb not found: ID does not exist" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.228080 4771 scope.go:117] "RemoveContainer" containerID="4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a" Feb 27 01:26:03 crc kubenswrapper[4771]: E0227 01:26:03.228514 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a\": container with ID starting with 
4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a not found: ID does not exist" containerID="4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.228542 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a"} err="failed to get container status \"4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a\": rpc error: code = NotFound desc = could not find container \"4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a\": container with ID starting with 4090bd38252a72300c4d29ddc0a5483f171e65cd7aa92c0d830e4a31ffc19a5a not found: ID does not exist" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.228580 4771 scope.go:117] "RemoveContainer" containerID="5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac" Feb 27 01:26:03 crc kubenswrapper[4771]: E0227 01:26:03.228923 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac\": container with ID starting with 5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac not found: ID does not exist" containerID="5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.228984 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac"} err="failed to get container status \"5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac\": rpc error: code = NotFound desc = could not find container \"5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac\": container with ID starting with 5083b6ecdd2f1a75e5285813a903f832e9bfc940fae24489f70999e6264674ac not found: ID does not exist"
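
The four NotFound errors above are a benign race, not a real failure: scope.go's "RemoveContainer" asks CRI-O to delete containers that have already been pruned, and the follow-up ContainerStatus query then reports NotFound, which the kubelet logs but treats as a completed deletion. A quick way to confirm that from a saved excerpt is to check that every ID in a NotFound error also appears in an earlier "RemoveContainer" entry. The Go sketch below does exactly that; the regular expressions assume this journal's exact quoting (including the \" escapes) and are illustrative rather than any official log format.

package main

import (
	"fmt"
	"io"
	"os"
	"regexp"
)

func main() {
	data, err := io.ReadAll(os.Stdin) // feed it a saved journal excerpt
	if err != nil {
		panic(err)
	}
	blob := string(data)
	// IDs the kubelet explicitly asked the runtime to remove.
	removeRE := regexp.MustCompile(`"RemoveContainer" containerID="([0-9a-f]{64})"`)
	// IDs that a later ContainerStatus query reported as NotFound.
	notFoundRE := regexp.MustCompile(`could not find container \\"([0-9a-f]{64})\\"`)
	removed := map[string]bool{}
	for _, m := range removeRE.FindAllStringSubmatch(blob, -1) {
		removed[m[1]] = true
	}
	seen := map[string]bool{}
	for _, m := range notFoundRE.FindAllStringSubmatch(blob, -1) {
		id := m[1]
		if seen[id] {
			continue // the same ID appears in several error lines
		}
		seen[id] = true
		if removed[id] {
			fmt.Printf("benign: %s... NotFound after an explicit RemoveContainer\n", id[:12])
		} else {
			fmt.Printf("suspicious: %s... NotFound with no RemoveContainer in this excerpt\n", id[:12])
		}
	}
}

Run over this excerpt it flags all four IDs (1b29e9eb, 90159afb, 4090bd38, 5083b6ec) as benign: each NotFound is preceded by its own "RemoveContainer" entry.
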
Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.264709 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.506797 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.568685 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.579406 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:03 crc kubenswrapper[4771]: E0227 01:26:03.579809 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="ceilometer-notification-agent" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.579824 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="ceilometer-notification-agent" Feb 27 01:26:03 crc kubenswrapper[4771]: E0227 01:26:03.579836 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="proxy-httpd" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.579842 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="proxy-httpd" Feb 27 01:26:03 crc kubenswrapper[4771]: E0227 01:26:03.579855 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="sg-core" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.579861 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="sg-core" Feb 27 01:26:03 crc kubenswrapper[4771]: E0227 01:26:03.579870 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="ceilometer-central-agent" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.579875 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="ceilometer-central-agent" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.580037 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="ceilometer-notification-agent" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.580051 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="sg-core" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.580063 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="ceilometer-central-agent" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.580073 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" containerName="proxy-httpd" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.582239 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.584746 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.585307 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.593123 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.674723 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.674767 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-run-httpd\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.675022 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-scripts\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.675078 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-config-data\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.675132 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-log-httpd\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.675179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj2w4\" (UniqueName: \"kubernetes.io/projected/2f082e8d-a1dc-43e2-9c41-7292439e0f88-kube-api-access-wj2w4\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.675338 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.777209 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-scripts\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.777261 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-config-data\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.777286 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-log-httpd\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.777363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj2w4\" (UniqueName: \"kubernetes.io/projected/2f082e8d-a1dc-43e2-9c41-7292439e0f88-kube-api-access-wj2w4\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.777934 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.777992 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.778023 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-run-httpd\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.777852 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-log-httpd\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.778957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-run-httpd\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.783597 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be945e5-35ac-4f9c-a844-c8c4b7a4fd49" path="/var/lib/kubelet/pods/1be945e5-35ac-4f9c-a844-c8c4b7a4fd49/volumes" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.784296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-config-data\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.784270 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-scripts\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.784415 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.786801 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.794408 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj2w4\" (UniqueName: \"kubernetes.io/projected/2f082e8d-a1dc-43e2-9c41-7292439e0f88-kube-api-access-wj2w4\") pod \"ceilometer-0\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " pod="openstack/ceilometer-0" Feb 27 01:26:03 crc kubenswrapper[4771]: I0227 01:26:03.904305 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:04 crc kubenswrapper[4771]: I0227 01:26:04.367147 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:04 crc kubenswrapper[4771]: W0227 01:26:04.370347 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f082e8d_a1dc_43e2_9c41_7292439e0f88.slice/crio-6b1f52ba62e5666b35c6ff737cc98350f973f58df302e00cdcf80e20f98de522 WatchSource:0}: Error finding container 6b1f52ba62e5666b35c6ff737cc98350f973f58df302e00cdcf80e20f98de522: Status 404 returned error can't find the container with id 6b1f52ba62e5666b35c6ff737cc98350f973f58df302e00cdcf80e20f98de522 Feb 27 01:26:04 crc kubenswrapper[4771]: I0227 01:26:04.464912 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-bsrhj" Feb 27 01:26:04 crc kubenswrapper[4771]: I0227 01:26:04.596068 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxllc\" (UniqueName: \"kubernetes.io/projected/abf6ea82-e9fc-4fd6-81c5-6883afb00b83-kube-api-access-gxllc\") pod \"abf6ea82-e9fc-4fd6-81c5-6883afb00b83\" (UID: \"abf6ea82-e9fc-4fd6-81c5-6883afb00b83\") " Feb 27 01:26:04 crc kubenswrapper[4771]: I0227 01:26:04.602381 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf6ea82-e9fc-4fd6-81c5-6883afb00b83-kube-api-access-gxllc" (OuterVolumeSpecName: "kube-api-access-gxllc") pod "abf6ea82-e9fc-4fd6-81c5-6883afb00b83" (UID: "abf6ea82-e9fc-4fd6-81c5-6883afb00b83"). InnerVolumeSpecName "kube-api-access-gxllc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:04 crc kubenswrapper[4771]: I0227 01:26:04.699134 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxllc\" (UniqueName: \"kubernetes.io/projected/abf6ea82-e9fc-4fd6-81c5-6883afb00b83-kube-api-access-gxllc\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:05 crc kubenswrapper[4771]: I0227 01:26:05.172681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerStarted","Data":"421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e"} Feb 27 01:26:05 crc kubenswrapper[4771]: I0227 01:26:05.172739 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerStarted","Data":"6b1f52ba62e5666b35c6ff737cc98350f973f58df302e00cdcf80e20f98de522"} Feb 27 01:26:05 crc kubenswrapper[4771]: I0227 01:26:05.174634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535926-bsrhj" event={"ID":"abf6ea82-e9fc-4fd6-81c5-6883afb00b83","Type":"ContainerDied","Data":"ea76fb71075d9f3caec10d3da72057c66050e08573813a0683754bebf567e677"} Feb 27 01:26:05 crc kubenswrapper[4771]: I0227 01:26:05.174672 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea76fb71075d9f3caec10d3da72057c66050e08573813a0683754bebf567e677" Feb 27 01:26:05 crc kubenswrapper[4771]: I0227 01:26:05.174779 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535926-bsrhj" Feb 27 01:26:05 crc kubenswrapper[4771]: I0227 01:26:05.548531 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-6lvqm"] Feb 27 01:26:05 crc kubenswrapper[4771]: I0227 01:26:05.558035 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535920-6lvqm"] Feb 27 01:26:05 crc kubenswrapper[4771]: I0227 01:26:05.784339 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5208a9-6556-4267-8519-a646c7b1aff6" path="/var/lib/kubelet/pods/db5208a9-6556-4267-8519-a646c7b1aff6/volumes" Feb 27 01:26:06 crc kubenswrapper[4771]: I0227 01:26:06.187631 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerStarted","Data":"86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941"} Feb 27 01:26:07 crc kubenswrapper[4771]: I0227 01:26:07.202682 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerStarted","Data":"066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b"} Feb 27 01:26:07 crc kubenswrapper[4771]: I0227 01:26:07.611340 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.196561 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-6vpvq"] Feb 27 01:26:08 crc kubenswrapper[4771]: E0227 01:26:08.197232 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf6ea82-e9fc-4fd6-81c5-6883afb00b83" containerName="oc" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.197251 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf6ea82-e9fc-4fd6-81c5-6883afb00b83" containerName="oc" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.197430 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf6ea82-e9fc-4fd6-81c5-6883afb00b83" containerName="oc" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.198148 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.201866 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.202533 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.215143 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6vpvq"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.373357 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.373421 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-config-data\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.373493 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7n5x\" (UniqueName: \"kubernetes.io/projected/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-kube-api-access-g7n5x\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.373631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-scripts\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.445430 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.446853 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.451699 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.475685 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-scripts\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.475802 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.475829 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-config-data\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.475883 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7n5x\" (UniqueName: \"kubernetes.io/projected/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-kube-api-access-g7n5x\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.480171 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.485266 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.486251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.488107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-scripts\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.496617 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.510073 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-config-data\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.512001 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7n5x\" (UniqueName: \"kubernetes.io/projected/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-kube-api-access-g7n5x\") pod \"nova-cell0-cell-mapping-6vpvq\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.523043 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.527358 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.578778 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.578885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.579020 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.579224 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxf2\" (UniqueName: \"kubernetes.io/projected/093bc468-6ace-4a5a-a695-b8b202f64bcd-kube-api-access-6vxf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.580616 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-config-data\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.581140 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97954daf-91f3-4031-a5bc-e5c6429a8810-logs\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.581230 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwzrk\" (UniqueName: \"kubernetes.io/projected/97954daf-91f3-4031-a5bc-e5c6429a8810-kube-api-access-gwzrk\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.594460 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.669158 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.670659 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.680148 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.682150 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.682357 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.682424 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.682536 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxf2\" (UniqueName: \"kubernetes.io/projected/093bc468-6ace-4a5a-a695-b8b202f64bcd-kube-api-access-6vxf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.682578 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-config-data\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.682651 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97954daf-91f3-4031-a5bc-e5c6429a8810-logs\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.682672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwzrk\" (UniqueName: \"kubernetes.io/projected/97954daf-91f3-4031-a5bc-e5c6429a8810-kube-api-access-gwzrk\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.682720 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.694189 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.741215 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97954daf-91f3-4031-a5bc-e5c6429a8810-logs\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.742062 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.754027 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-config-data\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.758599 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwzrk\" (UniqueName: \"kubernetes.io/projected/97954daf-91f3-4031-a5bc-e5c6429a8810-kube-api-access-gwzrk\") pod \"nova-api-0\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") " pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.765426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.766182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.766876 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.780360 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxf2\" (UniqueName: \"kubernetes.io/projected/093bc468-6ace-4a5a-a695-b8b202f64bcd-kube-api-access-6vxf2\") pod \"nova-cell1-novncproxy-0\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.783570 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.788376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-config-data\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.788449 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xgll\" (UniqueName: \"kubernetes.io/projected/51d00b94-9ca5-41df-9dd3-2c638d714751-kube-api-access-2xgll\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.788512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc 
kubenswrapper[4771]: I0227 01:26:08.788535 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-config-data\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.788664 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krchb\" (UniqueName: \"kubernetes.io/projected/bbd844cf-2d18-4cf0-946a-5fa088d72a18-kube-api-access-krchb\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.789175 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd844cf-2d18-4cf0-946a-5fa088d72a18-logs\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.789234 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.820675 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.893876 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xgll\" (UniqueName: \"kubernetes.io/projected/51d00b94-9ca5-41df-9dd3-2c638d714751-kube-api-access-2xgll\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.893929 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.893950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-config-data\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.894009 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krchb\" (UniqueName: \"kubernetes.io/projected/bbd844cf-2d18-4cf0-946a-5fa088d72a18-kube-api-access-krchb\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.894054 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd844cf-2d18-4cf0-946a-5fa088d72a18-logs\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.894071 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.894106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-config-data\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.895163 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd844cf-2d18-4cf0-946a-5fa088d72a18-logs\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.898647 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-mr7mf"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.906987 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-config-data\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.907397 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.907464 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-mr7mf"] Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.907579 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.912661 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.918244 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-config-data\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.959269 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.959502 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xgll\" (UniqueName: \"kubernetes.io/projected/51d00b94-9ca5-41df-9dd3-2c638d714751-kube-api-access-2xgll\") pod \"nova-scheduler-0\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " pod="openstack/nova-scheduler-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.964603 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krchb\" (UniqueName: \"kubernetes.io/projected/bbd844cf-2d18-4cf0-946a-5fa088d72a18-kube-api-access-krchb\") pod \"nova-metadata-0\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " pod="openstack/nova-metadata-0" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.997110 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.997384 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.997484 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.997647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/bc9923bd-11c1-4d1d-965b-17e8352ece8c-kube-api-access-t64wv\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:08 crc 
kubenswrapper[4771]: I0227 01:26:08.997770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:08 crc kubenswrapper[4771]: I0227 01:26:08.997956 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-config\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.068331 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.100528 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.100612 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.100706 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/bc9923bd-11c1-4d1d-965b-17e8352ece8c-kube-api-access-t64wv\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.100724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.101194 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-config\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.101249 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.104523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-sb\") pod 
\"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.105133 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.107299 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.108218 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-config\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.108377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-svc\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.108824 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.118538 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.131380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/bc9923bd-11c1-4d1d-965b-17e8352ece8c-kube-api-access-t64wv\") pod \"dnsmasq-dns-757b4f8459-mr7mf\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.251566 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6vpvq"] Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.260414 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.280870 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerStarted","Data":"59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201"} Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.281222 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.307386 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.642159167 podStartE2EDuration="6.307363039s" podCreationTimestamp="2026-02-27 01:26:03 +0000 UTC" firstStartedPulling="2026-02-27 01:26:04.373401109 +0000 UTC m=+1277.310962387" lastFinishedPulling="2026-02-27 01:26:08.038604971 +0000 UTC m=+1280.976166259" observedRunningTime="2026-02-27 01:26:09.297790359 +0000 UTC m=+1282.235351647" watchObservedRunningTime="2026-02-27 01:26:09.307363039 +0000 UTC m=+1282.244924327" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.535625 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 01:26:09 crc kubenswrapper[4771]: W0227 01:26:09.546641 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97954daf_91f3_4031_a5bc_e5c6429a8810.slice/crio-0d33f881f100af51641cfa7fe5722478a6df1de1bff73744367acc4c072272f5 WatchSource:0}: Error finding container 0d33f881f100af51641cfa7fe5722478a6df1de1bff73744367acc4c072272f5: Status 404 returned error can't find the container with id 0d33f881f100af51641cfa7fe5722478a6df1de1bff73744367acc4c072272f5 Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.575478 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbd9v"] Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.577102 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.580108 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.580754 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.591642 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbd9v"] Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.618041 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-config-data\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.618282 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.618406 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-scripts\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.618513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqxlk\" (UniqueName: \"kubernetes.io/projected/76c84cc9-8833-4227-af3f-7064c9232366-kube-api-access-lqxlk\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.720473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqxlk\" (UniqueName: \"kubernetes.io/projected/76c84cc9-8833-4227-af3f-7064c9232366-kube-api-access-lqxlk\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.720837 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-config-data\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.720938 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.721085 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-scripts\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.725847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-scripts\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.726474 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-config-data\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.726965 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.740067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqxlk\" (UniqueName: \"kubernetes.io/projected/76c84cc9-8833-4227-af3f-7064c9232366-kube-api-access-lqxlk\") pod \"nova-cell1-conductor-db-sync-bbd9v\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.797693 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.816256 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.904282 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.950964 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-mr7mf"] Feb 27 01:26:09 crc kubenswrapper[4771]: I0227 01:26:09.962015 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:09 crc kubenswrapper[4771]: W0227 01:26:09.975807 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc9923bd_11c1_4d1d_965b_17e8352ece8c.slice/crio-fcb1ce90d61040b27a02a35ac5a12c3b9e8ac6802b97c7f6a109a99577c307b9 WatchSource:0}: Error finding container fcb1ce90d61040b27a02a35ac5a12c3b9e8ac6802b97c7f6a109a99577c307b9: Status 404 returned error can't find the container with id fcb1ce90d61040b27a02a35ac5a12c3b9e8ac6802b97c7f6a109a99577c307b9 Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.292423 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"093bc468-6ace-4a5a-a695-b8b202f64bcd","Type":"ContainerStarted","Data":"a83e9360fec23743635a00a9181ac1936db3d189a396554dededf2f27679dcf5"} Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.294304 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"51d00b94-9ca5-41df-9dd3-2c638d714751","Type":"ContainerStarted","Data":"52d6c7671f12c940864f1b32e6d2a45fefa61f3d9d8a66b0f4511c0cf28fcbca"} Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.295532 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" event={"ID":"bc9923bd-11c1-4d1d-965b-17e8352ece8c","Type":"ContainerStarted","Data":"fcb1ce90d61040b27a02a35ac5a12c3b9e8ac6802b97c7f6a109a99577c307b9"} Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.298091 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97954daf-91f3-4031-a5bc-e5c6429a8810","Type":"ContainerStarted","Data":"0d33f881f100af51641cfa7fe5722478a6df1de1bff73744367acc4c072272f5"} Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.299038 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbd844cf-2d18-4cf0-946a-5fa088d72a18","Type":"ContainerStarted","Data":"d8c34a278cf389f99f44ebb52a28d542d980050fa69288dbc383f6c6caeac90c"} Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.301301 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6vpvq" event={"ID":"4d5da4dc-52ee-48f5-b5af-0fea453db0d7","Type":"ContainerStarted","Data":"a2240891cb114d6fa5dcbf2cd6dcd84379d8ca96e14a60c684c90dece827cd5d"} Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.301337 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6vpvq" event={"ID":"4d5da4dc-52ee-48f5-b5af-0fea453db0d7","Type":"ContainerStarted","Data":"d0e8faa8d177c5ade5044cb8b04e7fd5e512fb153219bc76faa8fdd1e5dcc698"} Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.326832 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-6vpvq" podStartSLOduration=2.326814758 podStartE2EDuration="2.326814758s" podCreationTimestamp="2026-02-27 01:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:10.317285609 +0000 
UTC m=+1283.254846897" watchObservedRunningTime="2026-02-27 01:26:10.326814758 +0000 UTC m=+1283.264376046" Feb 27 01:26:10 crc kubenswrapper[4771]: I0227 01:26:10.398979 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbd9v"] Feb 27 01:26:10 crc kubenswrapper[4771]: W0227 01:26:10.472170 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76c84cc9_8833_4227_af3f_7064c9232366.slice/crio-baf5b1eef4a768db09d363325721e008c674a28408f5935ba978705db0303d85 WatchSource:0}: Error finding container baf5b1eef4a768db09d363325721e008c674a28408f5935ba978705db0303d85: Status 404 returned error can't find the container with id baf5b1eef4a768db09d363325721e008c674a28408f5935ba978705db0303d85 Feb 27 01:26:11 crc kubenswrapper[4771]: I0227 01:26:11.324845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" event={"ID":"76c84cc9-8833-4227-af3f-7064c9232366","Type":"ContainerStarted","Data":"ec2f9a03b3dda302804ac55c3e4a7b3e8c1f087c8f42a89c2abf799701567422"} Feb 27 01:26:11 crc kubenswrapper[4771]: I0227 01:26:11.325116 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" event={"ID":"76c84cc9-8833-4227-af3f-7064c9232366","Type":"ContainerStarted","Data":"baf5b1eef4a768db09d363325721e008c674a28408f5935ba978705db0303d85"} Feb 27 01:26:11 crc kubenswrapper[4771]: I0227 01:26:11.328357 4771 generic.go:334] "Generic (PLEG): container finished" podID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" containerID="c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06" exitCode=0 Feb 27 01:26:11 crc kubenswrapper[4771]: I0227 01:26:11.328438 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" event={"ID":"bc9923bd-11c1-4d1d-965b-17e8352ece8c","Type":"ContainerDied","Data":"c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06"} Feb 27 01:26:11 crc kubenswrapper[4771]: I0227 01:26:11.340876 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" podStartSLOduration=2.3408403890000002 podStartE2EDuration="2.340840389s" podCreationTimestamp="2026-02-27 01:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:11.337863239 +0000 UTC m=+1284.275424517" watchObservedRunningTime="2026-02-27 01:26:11.340840389 +0000 UTC m=+1284.278401677" Feb 27 01:26:12 crc kubenswrapper[4771]: I0227 01:26:12.017962 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:12 crc kubenswrapper[4771]: I0227 01:26:12.030150 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.360315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97954daf-91f3-4031-a5bc-e5c6429a8810","Type":"ContainerStarted","Data":"df27ba885a5439582ce4046e5041d88cf53f4a5ef6b2197fbe25f13a9491e681"} Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.362322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" event={"ID":"bc9923bd-11c1-4d1d-965b-17e8352ece8c","Type":"ContainerStarted","Data":"8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d"} Feb 27 01:26:13 crc 
kubenswrapper[4771]: I0227 01:26:13.362444 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.364032 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbd844cf-2d18-4cf0-946a-5fa088d72a18","Type":"ContainerStarted","Data":"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635"} Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.371334 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"093bc468-6ace-4a5a-a695-b8b202f64bcd","Type":"ContainerStarted","Data":"00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945"} Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.371522 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="093bc468-6ace-4a5a-a695-b8b202f64bcd" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945" gracePeriod=30 Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.385346 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"51d00b94-9ca5-41df-9dd3-2c638d714751","Type":"ContainerStarted","Data":"2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2"} Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.386866 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" podStartSLOduration=5.38684321 podStartE2EDuration="5.38684321s" podCreationTimestamp="2026-02-27 01:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:13.381096014 +0000 UTC m=+1286.318657302" watchObservedRunningTime="2026-02-27 01:26:13.38684321 +0000 UTC m=+1286.324404488" Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.399838 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.356686739 podStartE2EDuration="5.399818723s" podCreationTimestamp="2026-02-27 01:26:08 +0000 UTC" firstStartedPulling="2026-02-27 01:26:09.803856729 +0000 UTC m=+1282.741418017" lastFinishedPulling="2026-02-27 01:26:12.846988713 +0000 UTC m=+1285.784550001" observedRunningTime="2026-02-27 01:26:13.396656856 +0000 UTC m=+1286.334218144" watchObservedRunningTime="2026-02-27 01:26:13.399818723 +0000 UTC m=+1286.337380011" Feb 27 01:26:13 crc kubenswrapper[4771]: I0227 01:26:13.416419 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.373154284 podStartE2EDuration="5.416402872s" podCreationTimestamp="2026-02-27 01:26:08 +0000 UTC" firstStartedPulling="2026-02-27 01:26:09.803575351 +0000 UTC m=+1282.741136639" lastFinishedPulling="2026-02-27 01:26:12.846823929 +0000 UTC m=+1285.784385227" observedRunningTime="2026-02-27 01:26:13.411776547 +0000 UTC m=+1286.349337835" watchObservedRunningTime="2026-02-27 01:26:13.416402872 +0000 UTC m=+1286.353964160" Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.070258 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.110640 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
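[Editor's note] The W-level manager.go:1169 "Failed to process watch event ... Status 404" entries are cAdvisor noticing a freshly created crio-<id> cgroup before the runtime has registered the container; each ID involved (0d33f881..., fcb1ce90..., baf5b1ee...) appears moments later in a successful ContainerStarted event for the same pod, so these warnings read as a benign startup race rather than a failure. A small illustrative snippet (the regex is mine, not kubelet code) that pulls the container ID out of such a cgroup path for cross-referencing:

```go
// Sketch: extract the 64-hex CRI-O container ID from a kubepods cgroup path
// like the ones in the watch-event warnings above.
package main

import (
	"fmt"
	"regexp"
)

var crioID = regexp.MustCompile(`crio-([0-9a-f]{64})`)

func main() {
	path := "/kubepods.slice/kubepods-besteffort.slice/" +
		"kubepods-besteffort-pod97954daf_91f3_4031_a5bc_e5c6429a8810.slice/" +
		"crio-0d33f881f100af51641cfa7fe5722478a6df1de1bff73744367acc4c072272f5"
	if m := crioID.FindStringSubmatch(path); m != nil {
		fmt.Println(m[1]) // same ID as the later ContainerStarted event for nova-api-0
	}
}
```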
pod="openstack/nova-scheduler-0" Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.395794 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97954daf-91f3-4031-a5bc-e5c6429a8810","Type":"ContainerStarted","Data":"b5d72c52d3852ec1831db3983f510adc0e98dd37936973994b307935a1da2592"} Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.398863 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbd844cf-2d18-4cf0-946a-5fa088d72a18","Type":"ContainerStarted","Data":"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db"} Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.399161 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerName="nova-metadata-metadata" containerID="cri-o://79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db" gracePeriod=30 Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.399142 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerName="nova-metadata-log" containerID="cri-o://5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635" gracePeriod=30 Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.421395 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.13386216 podStartE2EDuration="6.421375529s" podCreationTimestamp="2026-02-27 01:26:08 +0000 UTC" firstStartedPulling="2026-02-27 01:26:09.549905044 +0000 UTC m=+1282.487466332" lastFinishedPulling="2026-02-27 01:26:12.837418413 +0000 UTC m=+1285.774979701" observedRunningTime="2026-02-27 01:26:14.417094702 +0000 UTC m=+1287.354655990" watchObservedRunningTime="2026-02-27 01:26:14.421375529 +0000 UTC m=+1287.358936827" Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.447534 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.589006928 podStartE2EDuration="6.447511289s" podCreationTimestamp="2026-02-27 01:26:08 +0000 UTC" firstStartedPulling="2026-02-27 01:26:09.991020611 +0000 UTC m=+1282.928581889" lastFinishedPulling="2026-02-27 01:26:12.849524962 +0000 UTC m=+1285.787086250" observedRunningTime="2026-02-27 01:26:14.442232865 +0000 UTC m=+1287.379794183" watchObservedRunningTime="2026-02-27 01:26:14.447511289 +0000 UTC m=+1287.385072587" Feb 27 01:26:14 crc kubenswrapper[4771]: I0227 01:26:14.975900 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.051166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-config-data\") pod \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.051252 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-combined-ca-bundle\") pod \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.051492 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krchb\" (UniqueName: \"kubernetes.io/projected/bbd844cf-2d18-4cf0-946a-5fa088d72a18-kube-api-access-krchb\") pod \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.051603 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd844cf-2d18-4cf0-946a-5fa088d72a18-logs\") pod \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\" (UID: \"bbd844cf-2d18-4cf0-946a-5fa088d72a18\") " Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.052406 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbd844cf-2d18-4cf0-946a-5fa088d72a18-logs" (OuterVolumeSpecName: "logs") pod "bbd844cf-2d18-4cf0-946a-5fa088d72a18" (UID: "bbd844cf-2d18-4cf0-946a-5fa088d72a18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.053224 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd844cf-2d18-4cf0-946a-5fa088d72a18-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.071302 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd844cf-2d18-4cf0-946a-5fa088d72a18-kube-api-access-krchb" (OuterVolumeSpecName: "kube-api-access-krchb") pod "bbd844cf-2d18-4cf0-946a-5fa088d72a18" (UID: "bbd844cf-2d18-4cf0-946a-5fa088d72a18"). InnerVolumeSpecName "kube-api-access-krchb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.078636 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbd844cf-2d18-4cf0-946a-5fa088d72a18" (UID: "bbd844cf-2d18-4cf0-946a-5fa088d72a18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.081257 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-config-data" (OuterVolumeSpecName: "config-data") pod "bbd844cf-2d18-4cf0-946a-5fa088d72a18" (UID: "bbd844cf-2d18-4cf0-946a-5fa088d72a18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.155330 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.155365 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd844cf-2d18-4cf0-946a-5fa088d72a18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.155375 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krchb\" (UniqueName: \"kubernetes.io/projected/bbd844cf-2d18-4cf0-946a-5fa088d72a18-kube-api-access-krchb\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.414704 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerID="79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db" exitCode=0 Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.414751 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerID="5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635" exitCode=143 Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.414765 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.414845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbd844cf-2d18-4cf0-946a-5fa088d72a18","Type":"ContainerDied","Data":"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db"} Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.414883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbd844cf-2d18-4cf0-946a-5fa088d72a18","Type":"ContainerDied","Data":"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635"} Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.414902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbd844cf-2d18-4cf0-946a-5fa088d72a18","Type":"ContainerDied","Data":"d8c34a278cf389f99f44ebb52a28d542d980050fa69288dbc383f6c6caeac90c"} Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.414926 4771 scope.go:117] "RemoveContainer" containerID="79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.464181 4771 scope.go:117] "RemoveContainer" containerID="5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.472087 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.491421 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.504668 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:15 crc kubenswrapper[4771]: E0227 01:26:15.505175 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerName="nova-metadata-log" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.505199 4771 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerName="nova-metadata-log" Feb 27 01:26:15 crc kubenswrapper[4771]: E0227 01:26:15.505221 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerName="nova-metadata-metadata" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.505228 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerName="nova-metadata-metadata" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.505448 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerName="nova-metadata-metadata" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.505481 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" containerName="nova-metadata-log" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.506611 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.511719 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.511937 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.513305 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.518764 4771 scope.go:117] "RemoveContainer" containerID="79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db" Feb 27 01:26:15 crc kubenswrapper[4771]: E0227 01:26:15.521750 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db\": container with ID starting with 79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db not found: ID does not exist" containerID="79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.521803 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db"} err="failed to get container status \"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db\": rpc error: code = NotFound desc = could not find container \"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db\": container with ID starting with 79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db not found: ID does not exist" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.521836 4771 scope.go:117] "RemoveContainer" containerID="5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635" Feb 27 01:26:15 crc kubenswrapper[4771]: E0227 01:26:15.522535 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635\": container with ID starting with 5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635 not found: ID does not exist" containerID="5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 
01:26:15.522615 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635"} err="failed to get container status \"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635\": rpc error: code = NotFound desc = could not find container \"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635\": container with ID starting with 5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635 not found: ID does not exist" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.522666 4771 scope.go:117] "RemoveContainer" containerID="79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.523463 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db"} err="failed to get container status \"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db\": rpc error: code = NotFound desc = could not find container \"79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db\": container with ID starting with 79516992855ab904c953afadecc8de239e87d084253e7435693424dc19e0a2db not found: ID does not exist" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.523494 4771 scope.go:117] "RemoveContainer" containerID="5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.524539 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635"} err="failed to get container status \"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635\": rpc error: code = NotFound desc = could not find container \"5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635\": container with ID starting with 5e34b8cb69adb17949c273d8cd95f6a852936c972a662436ccd3823b4267d635 not found: ID does not exist" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.569366 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.569533 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-config-data\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.570115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.570198 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64f305a-75ae-416a-983f-3b3482ba9886-logs\") pod \"nova-metadata-0\" (UID: 
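[Editor's note] The repeating E-level "ContainerStatus from runtime service failed ... NotFound" and "DeleteContainer returned error" lines for IDs 79516992... and 5e34b8cb... show the kubelet retrying removal of containers that CRI-O already purged along with the pod sandbox; NotFound here means the cleanup is idempotently done, not that anything was lost. A sketch of the client-side pattern (the status/codes API is standard gRPC; the helper is mine, not kubelet code):

```go
// Sketch: treat NotFound from a CRI RemoveContainer-style call as success,
// since "already gone" is exactly the state a retried delete wants to reach.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func removeIfPresent(err error) error {
	if status.Code(err) == codes.NotFound {
		return nil
	}
	return err
}

func main() {
	err := status.Error(codes.NotFound, `could not find container "7951..."`)
	fmt.Println(removeIfPresent(err)) // <nil>: ignored as idempotent cleanup
}
```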
\"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.570245 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7np\" (UniqueName: \"kubernetes.io/projected/f64f305a-75ae-416a-983f-3b3482ba9886-kube-api-access-vz7np\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.671890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.672319 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64f305a-75ae-416a-983f-3b3482ba9886-logs\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.672370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7np\" (UniqueName: \"kubernetes.io/projected/f64f305a-75ae-416a-983f-3b3482ba9886-kube-api-access-vz7np\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.672412 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.673481 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64f305a-75ae-416a-983f-3b3482ba9886-logs\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.672533 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-config-data\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.678149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.678636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-config-data\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.693171 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7np\" (UniqueName: 
\"kubernetes.io/projected/f64f305a-75ae-416a-983f-3b3482ba9886-kube-api-access-vz7np\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.695924 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") " pod="openstack/nova-metadata-0" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.792715 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbd844cf-2d18-4cf0-946a-5fa088d72a18" path="/var/lib/kubelet/pods/bbd844cf-2d18-4cf0-946a-5fa088d72a18/volumes" Feb 27 01:26:15 crc kubenswrapper[4771]: I0227 01:26:15.832787 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:16 crc kubenswrapper[4771]: I0227 01:26:16.330999 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:16 crc kubenswrapper[4771]: W0227 01:26:16.356807 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf64f305a_75ae_416a_983f_3b3482ba9886.slice/crio-a045fc001895aaa7c3660a005b1b5ec6c80a960bd1ad316c5e7a8a71e3892fd0 WatchSource:0}: Error finding container a045fc001895aaa7c3660a005b1b5ec6c80a960bd1ad316c5e7a8a71e3892fd0: Status 404 returned error can't find the container with id a045fc001895aaa7c3660a005b1b5ec6c80a960bd1ad316c5e7a8a71e3892fd0 Feb 27 01:26:16 crc kubenswrapper[4771]: I0227 01:26:16.438784 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f64f305a-75ae-416a-983f-3b3482ba9886","Type":"ContainerStarted","Data":"a045fc001895aaa7c3660a005b1b5ec6c80a960bd1ad316c5e7a8a71e3892fd0"} Feb 27 01:26:17 crc kubenswrapper[4771]: I0227 01:26:17.454009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f64f305a-75ae-416a-983f-3b3482ba9886","Type":"ContainerStarted","Data":"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e"} Feb 27 01:26:17 crc kubenswrapper[4771]: I0227 01:26:17.455947 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f64f305a-75ae-416a-983f-3b3482ba9886","Type":"ContainerStarted","Data":"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0"} Feb 27 01:26:17 crc kubenswrapper[4771]: I0227 01:26:17.457901 4771 generic.go:334] "Generic (PLEG): container finished" podID="4d5da4dc-52ee-48f5-b5af-0fea453db0d7" containerID="a2240891cb114d6fa5dcbf2cd6dcd84379d8ca96e14a60c684c90dece827cd5d" exitCode=0 Feb 27 01:26:17 crc kubenswrapper[4771]: I0227 01:26:17.457985 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6vpvq" event={"ID":"4d5da4dc-52ee-48f5-b5af-0fea453db0d7","Type":"ContainerDied","Data":"a2240891cb114d6fa5dcbf2cd6dcd84379d8ca96e14a60c684c90dece827cd5d"} Feb 27 01:26:17 crc kubenswrapper[4771]: I0227 01:26:17.490179 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.490144799 podStartE2EDuration="2.490144799s" podCreationTimestamp="2026-02-27 01:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-27 01:26:17.48133801 +0000 UTC m=+1290.418899348" watchObservedRunningTime="2026-02-27 01:26:17.490144799 +0000 UTC m=+1290.427706097" Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.477421 4771 generic.go:334] "Generic (PLEG): container finished" podID="76c84cc9-8833-4227-af3f-7064c9232366" containerID="ec2f9a03b3dda302804ac55c3e4a7b3e8c1f087c8f42a89c2abf799701567422" exitCode=0 Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.477498 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" event={"ID":"76c84cc9-8833-4227-af3f-7064c9232366","Type":"ContainerDied","Data":"ec2f9a03b3dda302804ac55c3e4a7b3e8c1f087c8f42a89c2abf799701567422"} Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.872228 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.909265 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.909323 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.942502 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-scripts\") pod \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.942579 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7n5x\" (UniqueName: \"kubernetes.io/projected/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-kube-api-access-g7n5x\") pod \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.942638 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-config-data\") pod \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.942791 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-combined-ca-bundle\") pod \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\" (UID: \"4d5da4dc-52ee-48f5-b5af-0fea453db0d7\") " Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.948723 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-scripts" (OuterVolumeSpecName: "scripts") pod "4d5da4dc-52ee-48f5-b5af-0fea453db0d7" (UID: "4d5da4dc-52ee-48f5-b5af-0fea453db0d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.952718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-kube-api-access-g7n5x" (OuterVolumeSpecName: "kube-api-access-g7n5x") pod "4d5da4dc-52ee-48f5-b5af-0fea453db0d7" (UID: "4d5da4dc-52ee-48f5-b5af-0fea453db0d7"). InnerVolumeSpecName "kube-api-access-g7n5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.994141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-config-data" (OuterVolumeSpecName: "config-data") pod "4d5da4dc-52ee-48f5-b5af-0fea453db0d7" (UID: "4d5da4dc-52ee-48f5-b5af-0fea453db0d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:18 crc kubenswrapper[4771]: I0227 01:26:18.997592 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d5da4dc-52ee-48f5-b5af-0fea453db0d7" (UID: "4d5da4dc-52ee-48f5-b5af-0fea453db0d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.045126 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.045172 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.045187 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7n5x\" (UniqueName: \"kubernetes.io/projected/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-kube-api-access-g7n5x\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.045201 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5da4dc-52ee-48f5-b5af-0fea453db0d7-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.110448 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.145202 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.262888 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.354490 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tdlnj"] Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.354930 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" podUID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerName="dnsmasq-dns" containerID="cri-o://ca181ac2b33ab45be8149bd583e9136329282f6be4045ce3d776d820aae982ea" gracePeriod=10 Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.504255 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6vpvq" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.504272 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6vpvq" event={"ID":"4d5da4dc-52ee-48f5-b5af-0fea453db0d7","Type":"ContainerDied","Data":"d0e8faa8d177c5ade5044cb8b04e7fd5e512fb153219bc76faa8fdd1e5dcc698"} Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.504729 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0e8faa8d177c5ade5044cb8b04e7fd5e512fb153219bc76faa8fdd1e5dcc698" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.516750 4771 generic.go:334] "Generic (PLEG): container finished" podID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerID="ca181ac2b33ab45be8149bd583e9136329282f6be4045ce3d776d820aae982ea" exitCode=0 Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.517025 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" event={"ID":"dd33f007-b06d-4b0a-afb8-64e98985e598","Type":"ContainerDied","Data":"ca181ac2b33ab45be8149bd583e9136329282f6be4045ce3d776d820aae982ea"} Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.563814 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.686138 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.686328 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-log" containerID="cri-o://df27ba885a5439582ce4046e5041d88cf53f4a5ef6b2197fbe25f13a9491e681" gracePeriod=30 Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.686516 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-api" containerID="cri-o://b5d72c52d3852ec1831db3983f510adc0e98dd37936973994b307935a1da2592" gracePeriod=30 Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.695165 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": EOF" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.695239 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": EOF" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.726533 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.965423 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:26:19 crc kubenswrapper[4771]: I0227 01:26:19.973140 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.054674 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.066265 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-sb\") pod \"dd33f007-b06d-4b0a-afb8-64e98985e598\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.066336 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-nb\") pod \"dd33f007-b06d-4b0a-afb8-64e98985e598\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.066464 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-swift-storage-0\") pod \"dd33f007-b06d-4b0a-afb8-64e98985e598\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.066503 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-svc\") pod \"dd33f007-b06d-4b0a-afb8-64e98985e598\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.066578 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-config\") pod \"dd33f007-b06d-4b0a-afb8-64e98985e598\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.066620 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhfwz\" (UniqueName: \"kubernetes.io/projected/dd33f007-b06d-4b0a-afb8-64e98985e598-kube-api-access-qhfwz\") pod \"dd33f007-b06d-4b0a-afb8-64e98985e598\" (UID: \"dd33f007-b06d-4b0a-afb8-64e98985e598\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.093914 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd33f007-b06d-4b0a-afb8-64e98985e598-kube-api-access-qhfwz" (OuterVolumeSpecName: "kube-api-access-qhfwz") pod "dd33f007-b06d-4b0a-afb8-64e98985e598" (UID: "dd33f007-b06d-4b0a-afb8-64e98985e598"). InnerVolumeSpecName "kube-api-access-qhfwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.125441 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-config" (OuterVolumeSpecName: "config") pod "dd33f007-b06d-4b0a-afb8-64e98985e598" (UID: "dd33f007-b06d-4b0a-afb8-64e98985e598"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.127421 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd33f007-b06d-4b0a-afb8-64e98985e598" (UID: "dd33f007-b06d-4b0a-afb8-64e98985e598"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.135316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd33f007-b06d-4b0a-afb8-64e98985e598" (UID: "dd33f007-b06d-4b0a-afb8-64e98985e598"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.146388 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dd33f007-b06d-4b0a-afb8-64e98985e598" (UID: "dd33f007-b06d-4b0a-afb8-64e98985e598"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.158746 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd33f007-b06d-4b0a-afb8-64e98985e598" (UID: "dd33f007-b06d-4b0a-afb8-64e98985e598"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.168521 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-scripts\") pod \"76c84cc9-8833-4227-af3f-7064c9232366\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.168607 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqxlk\" (UniqueName: \"kubernetes.io/projected/76c84cc9-8833-4227-af3f-7064c9232366-kube-api-access-lqxlk\") pod \"76c84cc9-8833-4227-af3f-7064c9232366\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.168659 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-config-data\") pod \"76c84cc9-8833-4227-af3f-7064c9232366\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.168795 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-combined-ca-bundle\") pod \"76c84cc9-8833-4227-af3f-7064c9232366\" (UID: \"76c84cc9-8833-4227-af3f-7064c9232366\") " Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.169257 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 
01:26:20.169279 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.169291 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.169304 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.169316 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhfwz\" (UniqueName: \"kubernetes.io/projected/dd33f007-b06d-4b0a-afb8-64e98985e598-kube-api-access-qhfwz\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.169329 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd33f007-b06d-4b0a-afb8-64e98985e598-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.172081 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-scripts" (OuterVolumeSpecName: "scripts") pod "76c84cc9-8833-4227-af3f-7064c9232366" (UID: "76c84cc9-8833-4227-af3f-7064c9232366"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.174707 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c84cc9-8833-4227-af3f-7064c9232366-kube-api-access-lqxlk" (OuterVolumeSpecName: "kube-api-access-lqxlk") pod "76c84cc9-8833-4227-af3f-7064c9232366" (UID: "76c84cc9-8833-4227-af3f-7064c9232366"). InnerVolumeSpecName "kube-api-access-lqxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.194575 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76c84cc9-8833-4227-af3f-7064c9232366" (UID: "76c84cc9-8833-4227-af3f-7064c9232366"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.207886 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-config-data" (OuterVolumeSpecName: "config-data") pod "76c84cc9-8833-4227-af3f-7064c9232366" (UID: "76c84cc9-8833-4227-af3f-7064c9232366"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.271646 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.271883 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqxlk\" (UniqueName: \"kubernetes.io/projected/76c84cc9-8833-4227-af3f-7064c9232366-kube-api-access-lqxlk\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.271906 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.271917 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c84cc9-8833-4227-af3f-7064c9232366-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.526264 4771 generic.go:334] "Generic (PLEG): container finished" podID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerID="df27ba885a5439582ce4046e5041d88cf53f4a5ef6b2197fbe25f13a9491e681" exitCode=143 Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.526325 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97954daf-91f3-4031-a5bc-e5c6429a8810","Type":"ContainerDied","Data":"df27ba885a5439582ce4046e5041d88cf53f4a5ef6b2197fbe25f13a9491e681"} Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.527829 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.527838 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" event={"ID":"dd33f007-b06d-4b0a-afb8-64e98985e598","Type":"ContainerDied","Data":"1772810f2c49434d571a068258bc2d5cac4bc1c1755c2e1d2fb83f52d7eaf54d"} Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.527883 4771 scope.go:117] "RemoveContainer" containerID="ca181ac2b33ab45be8149bd583e9136329282f6be4045ce3d776d820aae982ea" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.530525 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" event={"ID":"76c84cc9-8833-4227-af3f-7064c9232366","Type":"ContainerDied","Data":"baf5b1eef4a768db09d363325721e008c674a28408f5935ba978705db0303d85"} Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.530590 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baf5b1eef4a768db09d363325721e008c674a28408f5935ba978705db0303d85" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.530596 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bbd9v" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.530894 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" containerName="nova-metadata-metadata" containerID="cri-o://1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e" gracePeriod=30 Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.530889 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" containerName="nova-metadata-log" containerID="cri-o://64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0" gracePeriod=30 Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.552392 4771 scope.go:117] "RemoveContainer" containerID="b98b33b34bf9699ba434450a4e8a742ba1cc8b6e07ff73ba995651810efd7dfe" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.587735 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 27 01:26:20 crc kubenswrapper[4771]: E0227 01:26:20.588056 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerName="init" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.588072 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerName="init" Feb 27 01:26:20 crc kubenswrapper[4771]: E0227 01:26:20.588087 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5da4dc-52ee-48f5-b5af-0fea453db0d7" containerName="nova-manage" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.588093 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5da4dc-52ee-48f5-b5af-0fea453db0d7" containerName="nova-manage" Feb 27 01:26:20 crc kubenswrapper[4771]: E0227 01:26:20.588121 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c84cc9-8833-4227-af3f-7064c9232366" containerName="nova-cell1-conductor-db-sync" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.588127 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c84cc9-8833-4227-af3f-7064c9232366" containerName="nova-cell1-conductor-db-sync" Feb 27 01:26:20 crc kubenswrapper[4771]: E0227 01:26:20.588142 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerName="dnsmasq-dns" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.588148 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerName="dnsmasq-dns" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.588776 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c84cc9-8833-4227-af3f-7064c9232366" containerName="nova-cell1-conductor-db-sync" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.588801 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerName="dnsmasq-dns" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.588812 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5da4dc-52ee-48f5-b5af-0fea453db0d7" containerName="nova-manage" Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.589942 4771 util.go:30] "No sandbox for pod can be found. 
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.595495 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.606472 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tdlnj"]
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.614913 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tdlnj"]
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.638990 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.682388 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf27295-a275-4fb3-9e79-c3627df37a39-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.682755 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf27295-a275-4fb3-9e79-c3627df37a39-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.682863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnbql\" (UniqueName: \"kubernetes.io/projected/fdf27295-a275-4fb3-9e79-c3627df37a39-kube-api-access-lnbql\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.787890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf27295-a275-4fb3-9e79-c3627df37a39-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.787956 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnbql\" (UniqueName: \"kubernetes.io/projected/fdf27295-a275-4fb3-9e79-c3627df37a39-kube-api-access-lnbql\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.788146 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf27295-a275-4fb3-9e79-c3627df37a39-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.793148 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf27295-a275-4fb3-9e79-c3627df37a39-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.797193 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf27295-a275-4fb3-9e79-c3627df37a39-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.808139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnbql\" (UniqueName: \"kubernetes.io/projected/fdf27295-a275-4fb3-9e79-c3627df37a39-kube-api-access-lnbql\") pod \"nova-cell1-conductor-0\" (UID: \"fdf27295-a275-4fb3-9e79-c3627df37a39\") " pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.833259 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.833327 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 01:26:20 crc kubenswrapper[4771]: I0227 01:26:20.910302 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.152659 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.301018 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-nova-metadata-tls-certs\") pod \"f64f305a-75ae-416a-983f-3b3482ba9886\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") "
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.301423 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64f305a-75ae-416a-983f-3b3482ba9886-logs\") pod \"f64f305a-75ae-416a-983f-3b3482ba9886\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") "
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.301567 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-combined-ca-bundle\") pod \"f64f305a-75ae-416a-983f-3b3482ba9886\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") "
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.301609 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-config-data\") pod \"f64f305a-75ae-416a-983f-3b3482ba9886\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") "
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.301646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz7np\" (UniqueName: \"kubernetes.io/projected/f64f305a-75ae-416a-983f-3b3482ba9886-kube-api-access-vz7np\") pod \"f64f305a-75ae-416a-983f-3b3482ba9886\" (UID: \"f64f305a-75ae-416a-983f-3b3482ba9886\") "
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.301759 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f64f305a-75ae-416a-983f-3b3482ba9886-logs" (OuterVolumeSpecName: "logs") pod "f64f305a-75ae-416a-983f-3b3482ba9886" (UID: "f64f305a-75ae-416a-983f-3b3482ba9886"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.301922 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64f305a-75ae-416a-983f-3b3482ba9886-logs\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.312415 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64f305a-75ae-416a-983f-3b3482ba9886-kube-api-access-vz7np" (OuterVolumeSpecName: "kube-api-access-vz7np") pod "f64f305a-75ae-416a-983f-3b3482ba9886" (UID: "f64f305a-75ae-416a-983f-3b3482ba9886"). InnerVolumeSpecName "kube-api-access-vz7np". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.328096 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f64f305a-75ae-416a-983f-3b3482ba9886" (UID: "f64f305a-75ae-416a-983f-3b3482ba9886"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.341061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-config-data" (OuterVolumeSpecName: "config-data") pod "f64f305a-75ae-416a-983f-3b3482ba9886" (UID: "f64f305a-75ae-416a-983f-3b3482ba9886"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.352559 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f64f305a-75ae-416a-983f-3b3482ba9886" (UID: "f64f305a-75ae-416a-983f-3b3482ba9886"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.406122 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.406154 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.406171 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz7np\" (UniqueName: \"kubernetes.io/projected/f64f305a-75ae-416a-983f-3b3482ba9886-kube-api-access-vz7np\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.406181 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64f305a-75ae-416a-983f-3b3482ba9886-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:21 crc kubenswrapper[4771]: W0227 01:26:21.408535 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf27295_a275_4fb3_9e79_c3627df37a39.slice/crio-e0a5257d52e2696403b7fd1061cce9b972794cc14676029b37b7fdf4cf11f263 WatchSource:0}: Error finding container e0a5257d52e2696403b7fd1061cce9b972794cc14676029b37b7fdf4cf11f263: Status 404 returned error can't find the container with id e0a5257d52e2696403b7fd1061cce9b972794cc14676029b37b7fdf4cf11f263 Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.411434 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.539468 4771 generic.go:334] "Generic (PLEG): container finished" podID="f64f305a-75ae-416a-983f-3b3482ba9886" containerID="1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e" exitCode=0 Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.539504 4771 generic.go:334] "Generic (PLEG): container finished" podID="f64f305a-75ae-416a-983f-3b3482ba9886" containerID="64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0" exitCode=143 Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.539511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f64f305a-75ae-416a-983f-3b3482ba9886","Type":"ContainerDied","Data":"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e"} Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.539593 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f64f305a-75ae-416a-983f-3b3482ba9886","Type":"ContainerDied","Data":"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0"} Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.539608 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f64f305a-75ae-416a-983f-3b3482ba9886","Type":"ContainerDied","Data":"a045fc001895aaa7c3660a005b1b5ec6c80a960bd1ad316c5e7a8a71e3892fd0"} Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.539626 4771 scope.go:117] "RemoveContainer" containerID="1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.539528 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.540942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fdf27295-a275-4fb3-9e79-c3627df37a39","Type":"ContainerStarted","Data":"e0a5257d52e2696403b7fd1061cce9b972794cc14676029b37b7fdf4cf11f263"} Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.540991 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="51d00b94-9ca5-41df-9dd3-2c638d714751" containerName="nova-scheduler-scheduler" containerID="cri-o://2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2" gracePeriod=30 Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.575485 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.589614 4771 scope.go:117] "RemoveContainer" containerID="64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.590289 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.606995 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:21 crc kubenswrapper[4771]: E0227 01:26:21.607715 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" containerName="nova-metadata-metadata" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.607740 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" containerName="nova-metadata-metadata" Feb 27 01:26:21 crc kubenswrapper[4771]: E0227 01:26:21.607760 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" containerName="nova-metadata-log" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.607769 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" containerName="nova-metadata-log" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.608007 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" containerName="nova-metadata-log" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.608036 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" containerName="nova-metadata-metadata" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.610697 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.611264 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.611337 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.611386 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccd33248-70b6-45be-852d-d10692d396ad-logs\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.611422 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5r2s\" (UniqueName: \"kubernetes.io/projected/ccd33248-70b6-45be-852d-d10692d396ad-kube-api-access-p5r2s\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.611484 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-config-data\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.614157 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.614353 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.640534 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.643887 4771 scope.go:117] "RemoveContainer" containerID="1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e" Feb 27 01:26:21 crc kubenswrapper[4771]: E0227 01:26:21.644228 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e\": container with ID starting with 1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e not found: ID does not exist" containerID="1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.644320 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e"} err="failed to get container status \"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e\": rpc error: code = NotFound desc = could not find container 
\"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e\": container with ID starting with 1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e not found: ID does not exist" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.644408 4771 scope.go:117] "RemoveContainer" containerID="64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0" Feb 27 01:26:21 crc kubenswrapper[4771]: E0227 01:26:21.644815 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0\": container with ID starting with 64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0 not found: ID does not exist" containerID="64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.644837 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0"} err="failed to get container status \"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0\": rpc error: code = NotFound desc = could not find container \"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0\": container with ID starting with 64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0 not found: ID does not exist" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.644850 4771 scope.go:117] "RemoveContainer" containerID="1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.645073 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e"} err="failed to get container status \"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e\": rpc error: code = NotFound desc = could not find container \"1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e\": container with ID starting with 1918c59e4c0f85b43001a9418848bbef0411d66b0ed2ff07f139617de5c62e8e not found: ID does not exist" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.645087 4771 scope.go:117] "RemoveContainer" containerID="64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.645376 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0"} err="failed to get container status \"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0\": rpc error: code = NotFound desc = could not find container \"64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0\": container with ID starting with 64c2660e0ba3519271d87e2b4ebf7c3bffde7ca9c56a6c5cf3057e78e76a05f0 not found: ID does not exist" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.712750 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.713003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.713085 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccd33248-70b6-45be-852d-d10692d396ad-logs\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.713162 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5r2s\" (UniqueName: \"kubernetes.io/projected/ccd33248-70b6-45be-852d-d10692d396ad-kube-api-access-p5r2s\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.713282 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-config-data\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.715538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccd33248-70b6-45be-852d-d10692d396ad-logs\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.716258 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.716923 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.719363 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-config-data\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.743151 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5r2s\" (UniqueName: \"kubernetes.io/projected/ccd33248-70b6-45be-852d-d10692d396ad-kube-api-access-p5r2s\") pod \"nova-metadata-0\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " pod="openstack/nova-metadata-0" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.788193 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd33f007-b06d-4b0a-afb8-64e98985e598" path="/var/lib/kubelet/pods/dd33f007-b06d-4b0a-afb8-64e98985e598/volumes" Feb 27 01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.789328 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64f305a-75ae-416a-983f-3b3482ba9886" path="/var/lib/kubelet/pods/f64f305a-75ae-416a-983f-3b3482ba9886/volumes" Feb 27 
01:26:21 crc kubenswrapper[4771]: I0227 01:26:21.938180 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:26:22 crc kubenswrapper[4771]: I0227 01:26:22.443075 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:26:22 crc kubenswrapper[4771]: I0227 01:26:22.550161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccd33248-70b6-45be-852d-d10692d396ad","Type":"ContainerStarted","Data":"9c87c321841e287a1fb801c55d56aa6af0473a132e5ac064def59317645ad27e"} Feb 27 01:26:22 crc kubenswrapper[4771]: I0227 01:26:22.552621 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fdf27295-a275-4fb3-9e79-c3627df37a39","Type":"ContainerStarted","Data":"d35f10922450b316b5c9f3ad0650ed1ddb34698b80f80eaab15c8534dd5b919c"} Feb 27 01:26:22 crc kubenswrapper[4771]: I0227 01:26:22.553627 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 27 01:26:22 crc kubenswrapper[4771]: I0227 01:26:22.579990 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.579968782 podStartE2EDuration="2.579968782s" podCreationTimestamp="2026-02-27 01:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:22.570180916 +0000 UTC m=+1295.507742224" watchObservedRunningTime="2026-02-27 01:26:22.579968782 +0000 UTC m=+1295.517530070" Feb 27 01:26:23 crc kubenswrapper[4771]: I0227 01:26:23.566911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccd33248-70b6-45be-852d-d10692d396ad","Type":"ContainerStarted","Data":"f9a5dc02012b3c6cfdd5f5baa51125985982be78f0918140df7bbbd0715b337e"} Feb 27 01:26:23 crc kubenswrapper[4771]: I0227 01:26:23.567336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccd33248-70b6-45be-852d-d10692d396ad","Type":"ContainerStarted","Data":"a14ac2fd614cde8c7e8a809fc3c047f9af302e879db29dc00bc8dc394a87bfe1"} Feb 27 01:26:23 crc kubenswrapper[4771]: I0227 01:26:23.603584 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.603562823 podStartE2EDuration="2.603562823s" podCreationTimestamp="2026-02-27 01:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:23.599083892 +0000 UTC m=+1296.536645220" watchObservedRunningTime="2026-02-27 01:26:23.603562823 +0000 UTC m=+1296.541124131" Feb 27 01:26:24 crc kubenswrapper[4771]: E0227 01:26:24.113895 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 01:26:24 crc kubenswrapper[4771]: E0227 01:26:24.116036 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 01:26:24 crc kubenswrapper[4771]: E0227 01:26:24.118318 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 01:26:24 crc kubenswrapper[4771]: E0227 01:26:24.118373 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="51d00b94-9ca5-41df-9dd3-2c638d714751" containerName="nova-scheduler-scheduler" Feb 27 01:26:24 crc kubenswrapper[4771]: I0227 01:26:24.736369 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-tdlnj" podUID="dd33f007-b06d-4b0a-afb8-64e98985e598" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: i/o timeout" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.288481 4771 scope.go:117] "RemoveContainer" containerID="044206f14224ccf813eacb74b23ed15fc083de8149735fb6e64399341fa97794" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.403807 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.585982 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-config-data\") pod \"51d00b94-9ca5-41df-9dd3-2c638d714751\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.586303 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-combined-ca-bundle\") pod \"51d00b94-9ca5-41df-9dd3-2c638d714751\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.586638 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xgll\" (UniqueName: \"kubernetes.io/projected/51d00b94-9ca5-41df-9dd3-2c638d714751-kube-api-access-2xgll\") pod \"51d00b94-9ca5-41df-9dd3-2c638d714751\" (UID: \"51d00b94-9ca5-41df-9dd3-2c638d714751\") " Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.592753 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d00b94-9ca5-41df-9dd3-2c638d714751-kube-api-access-2xgll" (OuterVolumeSpecName: "kube-api-access-2xgll") pod "51d00b94-9ca5-41df-9dd3-2c638d714751" (UID: "51d00b94-9ca5-41df-9dd3-2c638d714751"). InnerVolumeSpecName "kube-api-access-2xgll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.595841 4771 generic.go:334] "Generic (PLEG): container finished" podID="51d00b94-9ca5-41df-9dd3-2c638d714751" containerID="2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2" exitCode=0 Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.595901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"51d00b94-9ca5-41df-9dd3-2c638d714751","Type":"ContainerDied","Data":"2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2"} Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.595927 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"51d00b94-9ca5-41df-9dd3-2c638d714751","Type":"ContainerDied","Data":"52d6c7671f12c940864f1b32e6d2a45fefa61f3d9d8a66b0f4511c0cf28fcbca"} Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.595951 4771 scope.go:117] "RemoveContainer" containerID="2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.596059 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.600473 4771 generic.go:334] "Generic (PLEG): container finished" podID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerID="b5d72c52d3852ec1831db3983f510adc0e98dd37936973994b307935a1da2592" exitCode=0 Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.600512 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97954daf-91f3-4031-a5bc-e5c6429a8810","Type":"ContainerDied","Data":"b5d72c52d3852ec1831db3983f510adc0e98dd37936973994b307935a1da2592"} Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.617281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-config-data" (OuterVolumeSpecName: "config-data") pod "51d00b94-9ca5-41df-9dd3-2c638d714751" (UID: "51d00b94-9ca5-41df-9dd3-2c638d714751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.619805 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51d00b94-9ca5-41df-9dd3-2c638d714751" (UID: "51d00b94-9ca5-41df-9dd3-2c638d714751"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.632341 4771 scope.go:117] "RemoveContainer" containerID="2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2" Feb 27 01:26:25 crc kubenswrapper[4771]: E0227 01:26:25.632860 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2\": container with ID starting with 2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2 not found: ID does not exist" containerID="2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.632885 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2"} err="failed to get container status \"2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2\": rpc error: code = NotFound desc = could not find container \"2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2\": container with ID starting with 2a09dd2f48e09cc22f9cf6d44d22191508997f795ecc77431981f495685f7bb2 not found: ID does not exist" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.688864 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xgll\" (UniqueName: \"kubernetes.io/projected/51d00b94-9ca5-41df-9dd3-2c638d714751-kube-api-access-2xgll\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.688894 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.688908 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d00b94-9ca5-41df-9dd3-2c638d714751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:25 crc kubenswrapper[4771]: I0227 01:26:25.957693 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.000776 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.008917 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:26:26 crc kubenswrapper[4771]: E0227 01:26:26.009358 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d00b94-9ca5-41df-9dd3-2c638d714751" containerName="nova-scheduler-scheduler" Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.009384 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d00b94-9ca5-41df-9dd3-2c638d714751" containerName="nova-scheduler-scheduler" Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.009640 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d00b94-9ca5-41df-9dd3-2c638d714751" containerName="nova-scheduler-scheduler" Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.010418 4771 util.go:30] "No sandbox for pod can be found. 
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.012279 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.017433 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.155900 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.200829 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltqtj\" (UniqueName: \"kubernetes.io/projected/bf83205a-9abc-4989-8804-2368ba05c0ef-kube-api-access-ltqtj\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.200930 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.200969 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-config-data\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.302427 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-combined-ca-bundle\") pod \"97954daf-91f3-4031-a5bc-e5c6429a8810\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") "
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.302675 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwzrk\" (UniqueName: \"kubernetes.io/projected/97954daf-91f3-4031-a5bc-e5c6429a8810-kube-api-access-gwzrk\") pod \"97954daf-91f3-4031-a5bc-e5c6429a8810\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") "
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.302812 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-config-data\") pod \"97954daf-91f3-4031-a5bc-e5c6429a8810\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") "
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.302991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97954daf-91f3-4031-a5bc-e5c6429a8810-logs\") pod \"97954daf-91f3-4031-a5bc-e5c6429a8810\" (UID: \"97954daf-91f3-4031-a5bc-e5c6429a8810\") "
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.303439 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97954daf-91f3-4031-a5bc-e5c6429a8810-logs" (OuterVolumeSpecName: "logs") pod "97954daf-91f3-4031-a5bc-e5c6429a8810" (UID: "97954daf-91f3-4031-a5bc-e5c6429a8810"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.303690 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.303771 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-config-data\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.304049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltqtj\" (UniqueName: \"kubernetes.io/projected/bf83205a-9abc-4989-8804-2368ba05c0ef-kube-api-access-ltqtj\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.304207 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97954daf-91f3-4031-a5bc-e5c6429a8810-logs\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.308174 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97954daf-91f3-4031-a5bc-e5c6429a8810-kube-api-access-gwzrk" (OuterVolumeSpecName: "kube-api-access-gwzrk") pod "97954daf-91f3-4031-a5bc-e5c6429a8810" (UID: "97954daf-91f3-4031-a5bc-e5c6429a8810"). InnerVolumeSpecName "kube-api-access-gwzrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.310840 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.315998 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-config-data\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.323865 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltqtj\" (UniqueName: \"kubernetes.io/projected/bf83205a-9abc-4989-8804-2368ba05c0ef-kube-api-access-ltqtj\") pod \"nova-scheduler-0\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.332540 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-config-data" (OuterVolumeSpecName: "config-data") pod "97954daf-91f3-4031-a5bc-e5c6429a8810" (UID: "97954daf-91f3-4031-a5bc-e5c6429a8810"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.334190 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97954daf-91f3-4031-a5bc-e5c6429a8810" (UID: "97954daf-91f3-4031-a5bc-e5c6429a8810"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.356258 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.404934 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwzrk\" (UniqueName: \"kubernetes.io/projected/97954daf-91f3-4031-a5bc-e5c6429a8810-kube-api-access-gwzrk\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.404970 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.404980 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97954daf-91f3-4031-a5bc-e5c6429a8810-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.610698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97954daf-91f3-4031-a5bc-e5c6429a8810","Type":"ContainerDied","Data":"0d33f881f100af51641cfa7fe5722478a6df1de1bff73744367acc4c072272f5"}
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.610716 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.611087 4771 scope.go:117] "RemoveContainer" containerID="b5d72c52d3852ec1831db3983f510adc0e98dd37936973994b307935a1da2592"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.656485 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.656727 4771 scope.go:117] "RemoveContainer" containerID="df27ba885a5439582ce4046e5041d88cf53f4a5ef6b2197fbe25f13a9491e681"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.667792 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.678648 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:26 crc kubenswrapper[4771]: E0227 01:26:26.679176 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-api"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.679195 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-api"
Feb 27 01:26:26 crc kubenswrapper[4771]: E0227 01:26:26.679233 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-log"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.679243 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-log"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.679473 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-log"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.679499 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" containerName="nova-api-api"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.680794 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.687189 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.695363 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.781355 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.815680 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3aca07-3840-4b9d-85f8-86599214f911-logs\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.816014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4hhx\" (UniqueName: \"kubernetes.io/projected/0f3aca07-3840-4b9d-85f8-86599214f911-kube-api-access-l4hhx\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.816060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-config-data\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.816085 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.918108 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4hhx\" (UniqueName: \"kubernetes.io/projected/0f3aca07-3840-4b9d-85f8-86599214f911-kube-api-access-l4hhx\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.918195 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-config-data\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.918256 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.918355 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3aca07-3840-4b9d-85f8-86599214f911-logs\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.920162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3aca07-3840-4b9d-85f8-86599214f911-logs\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.926273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.926930 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-config-data\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.938951 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.939005 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 27 01:26:26 crc kubenswrapper[4771]: I0227 01:26:26.939687 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4hhx\" (UniqueName: \"kubernetes.io/projected/0f3aca07-3840-4b9d-85f8-86599214f911-kube-api-access-l4hhx\") pod \"nova-api-0\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") " pod="openstack/nova-api-0"
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.005539 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.330838 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.625985 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f3aca07-3840-4b9d-85f8-86599214f911","Type":"ContainerStarted","Data":"3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839"}
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.626393 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f3aca07-3840-4b9d-85f8-86599214f911","Type":"ContainerStarted","Data":"068f84c6c4b7b7cbd149deee0b89c86c3a3933111854bfe840ea50f11d657d5c"}
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.628040 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf83205a-9abc-4989-8804-2368ba05c0ef","Type":"ContainerStarted","Data":"cfc3024b9a7b28be1f5bcdd93565d72778f90a3bcd576c9e85767b245cd395cb"}
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.628069 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf83205a-9abc-4989-8804-2368ba05c0ef","Type":"ContainerStarted","Data":"5e58d30f676353340076883927be9cb25bdd80576e088f7ae5936a7ea410365b"}
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.656179 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.656156775 podStartE2EDuration="2.656156775s" podCreationTimestamp="2026-02-27 01:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:27.642671769 +0000 UTC m=+1300.580233077" watchObservedRunningTime="2026-02-27 01:26:27.656156775 +0000 UTC m=+1300.593718063"
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.788702 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d00b94-9ca5-41df-9dd3-2c638d714751" path="/var/lib/kubelet/pods/51d00b94-9ca5-41df-9dd3-2c638d714751/volumes"
Feb 27 01:26:27 crc kubenswrapper[4771]: I0227 01:26:27.789797 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97954daf-91f3-4031-a5bc-e5c6429a8810" path="/var/lib/kubelet/pods/97954daf-91f3-4031-a5bc-e5c6429a8810/volumes"
Feb 27 01:26:28 crc kubenswrapper[4771]: I0227 01:26:28.643963 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f3aca07-3840-4b9d-85f8-86599214f911","Type":"ContainerStarted","Data":"ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95"}
Feb 27 01:26:28 crc kubenswrapper[4771]: I0227 01:26:28.675321 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6752975660000002 podStartE2EDuration="2.675297566s" podCreationTimestamp="2026-02-27 01:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:28.668159731 +0000 UTC m=+1301.605721039" watchObservedRunningTime="2026-02-27 01:26:28.675297566 +0000 UTC m=+1301.612858894"
Feb 27 01:26:28 crc kubenswrapper[4771]: I0227 01:26:28.952823 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:26:28 crc kubenswrapper[4771]: I0227 01:26:28.952898 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:26:30 crc kubenswrapper[4771]: I0227 01:26:30.956454 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 27 01:26:31 crc kubenswrapper[4771]: I0227 01:26:31.356758 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 27 01:26:31 crc kubenswrapper[4771]: I0227 01:26:31.938680 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 27 01:26:31 crc kubenswrapper[4771]: I0227 01:26:31.938741 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 27 01:26:32 crc kubenswrapper[4771]: I0227 01:26:32.965766 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 27 01:26:32 crc kubenswrapper[4771]: I0227 01:26:32.965784 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http:
request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 01:26:33 crc kubenswrapper[4771]: I0227 01:26:33.912330 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 27 01:26:36 crc kubenswrapper[4771]: I0227 01:26:36.357281 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 27 01:26:36 crc kubenswrapper[4771]: I0227 01:26:36.383616 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 27 01:26:36 crc kubenswrapper[4771]: I0227 01:26:36.761924 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 27 01:26:37 crc kubenswrapper[4771]: I0227 01:26:37.006686 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 01:26:37 crc kubenswrapper[4771]: I0227 01:26:37.006752 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 01:26:37 crc kubenswrapper[4771]: I0227 01:26:37.533509 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:26:37 crc kubenswrapper[4771]: I0227 01:26:37.533777 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3ba1222d-39ed-4c00-a636-86788e0f6db6" containerName="kube-state-metrics" containerID="cri-o://d2488f3c3f5a1d3c6a7dddb0cedd991b11c548193fbe969322d1495423e8e8ad" gracePeriod=30 Feb 27 01:26:37 crc kubenswrapper[4771]: E0227 01:26:37.634565 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ba1222d_39ed_4c00_a636_86788e0f6db6.slice/crio-d2488f3c3f5a1d3c6a7dddb0cedd991b11c548193fbe969322d1495423e8e8ad.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ba1222d_39ed_4c00_a636_86788e0f6db6.slice/crio-conmon-d2488f3c3f5a1d3c6a7dddb0cedd991b11c548193fbe969322d1495423e8e8ad.scope\": RecentStats: unable to find data in memory cache]" Feb 27 01:26:37 crc kubenswrapper[4771]: I0227 01:26:37.747057 4771 generic.go:334] "Generic (PLEG): container finished" podID="3ba1222d-39ed-4c00-a636-86788e0f6db6" containerID="d2488f3c3f5a1d3c6a7dddb0cedd991b11c548193fbe969322d1495423e8e8ad" exitCode=2 Feb 27 01:26:37 crc kubenswrapper[4771]: I0227 01:26:37.747643 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ba1222d-39ed-4c00-a636-86788e0f6db6","Type":"ContainerDied","Data":"d2488f3c3f5a1d3c6a7dddb0cedd991b11c548193fbe969322d1495423e8e8ad"} Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.021474 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.089486 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.206:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.089630 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.206:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.186517 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpnzs\" (UniqueName: \"kubernetes.io/projected/3ba1222d-39ed-4c00-a636-86788e0f6db6-kube-api-access-qpnzs\") pod \"3ba1222d-39ed-4c00-a636-86788e0f6db6\" (UID: \"3ba1222d-39ed-4c00-a636-86788e0f6db6\") " Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.191710 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba1222d-39ed-4c00-a636-86788e0f6db6-kube-api-access-qpnzs" (OuterVolumeSpecName: "kube-api-access-qpnzs") pod "3ba1222d-39ed-4c00-a636-86788e0f6db6" (UID: "3ba1222d-39ed-4c00-a636-86788e0f6db6"). InnerVolumeSpecName "kube-api-access-qpnzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.290763 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpnzs\" (UniqueName: \"kubernetes.io/projected/3ba1222d-39ed-4c00-a636-86788e0f6db6-kube-api-access-qpnzs\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.757264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3ba1222d-39ed-4c00-a636-86788e0f6db6","Type":"ContainerDied","Data":"e88c2f71086568b73c70d3d780138057c9c8504470999892f2bef34040219e5f"} Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.757307 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.757327 4771 scope.go:117] "RemoveContainer" containerID="d2488f3c3f5a1d3c6a7dddb0cedd991b11c548193fbe969322d1495423e8e8ad" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.799432 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.815010 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.827763 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:26:38 crc kubenswrapper[4771]: E0227 01:26:38.828486 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba1222d-39ed-4c00-a636-86788e0f6db6" containerName="kube-state-metrics" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.828522 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba1222d-39ed-4c00-a636-86788e0f6db6" containerName="kube-state-metrics" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.828878 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba1222d-39ed-4c00-a636-86788e0f6db6" containerName="kube-state-metrics" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.829953 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.836036 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.836103 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 27 01:26:38 crc kubenswrapper[4771]: I0227 01:26:38.850061 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.002734 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhhwn\" (UniqueName: \"kubernetes.io/projected/336e9838-30f4-4164-8664-073e172d8750-kube-api-access-vhhwn\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.002866 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.002900 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.002935 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.106810 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.106890 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.106943 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.107117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhhwn\" (UniqueName: \"kubernetes.io/projected/336e9838-30f4-4164-8664-073e172d8750-kube-api-access-vhhwn\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.111741 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.113948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.115764 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/336e9838-30f4-4164-8664-073e172d8750-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.127692 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhhwn\" (UniqueName: \"kubernetes.io/projected/336e9838-30f4-4164-8664-073e172d8750-kube-api-access-vhhwn\") pod \"kube-state-metrics-0\" (UID: \"336e9838-30f4-4164-8664-073e172d8750\") " pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.149997 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.495924 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.628870 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.629211 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="ceilometer-central-agent" containerID="cri-o://421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e" gracePeriod=30 Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.629369 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="proxy-httpd" containerID="cri-o://59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201" gracePeriod=30 Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.629431 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="sg-core" containerID="cri-o://066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b" gracePeriod=30 Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.629485 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="ceilometer-notification-agent" containerID="cri-o://86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941" gracePeriod=30 Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.784696 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba1222d-39ed-4c00-a636-86788e0f6db6" path="/var/lib/kubelet/pods/3ba1222d-39ed-4c00-a636-86788e0f6db6/volumes" Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.785419 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"336e9838-30f4-4164-8664-073e172d8750","Type":"ContainerStarted","Data":"9ace603d98517d310a345c8b7ef8bc4ed140d82709f108cf86cc89283537c7e6"} Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.787827 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerID="59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201" exitCode=0 Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.787870 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerID="066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b" exitCode=2 Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.787930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerDied","Data":"59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201"} Feb 27 01:26:39 crc kubenswrapper[4771]: I0227 01:26:39.787962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerDied","Data":"066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b"} Feb 27 01:26:40 crc kubenswrapper[4771]: I0227 01:26:40.805287 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"336e9838-30f4-4164-8664-073e172d8750","Type":"ContainerStarted","Data":"fee7298364563c4941a650443e0ba773cdcd1020201bed079623103abdd53593"} Feb 27 01:26:40 crc kubenswrapper[4771]: I0227 01:26:40.805899 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 27 01:26:40 crc kubenswrapper[4771]: I0227 01:26:40.813656 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerID="421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e" exitCode=0 Feb 27 01:26:40 crc kubenswrapper[4771]: I0227 01:26:40.813727 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerDied","Data":"421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e"} Feb 27 01:26:40 crc kubenswrapper[4771]: I0227 01:26:40.834525 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.467150535 podStartE2EDuration="2.834504569s" podCreationTimestamp="2026-02-27 01:26:38 +0000 UTC" firstStartedPulling="2026-02-27 01:26:39.513893203 +0000 UTC m=+1312.451454491" lastFinishedPulling="2026-02-27 01:26:39.881247237 +0000 UTC m=+1312.818808525" observedRunningTime="2026-02-27 01:26:40.830712316 +0000 UTC m=+1313.768273614" watchObservedRunningTime="2026-02-27 01:26:40.834504569 +0000 UTC m=+1313.772065867" Feb 27 01:26:41 crc kubenswrapper[4771]: I0227 01:26:41.948185 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 27 01:26:41 crc kubenswrapper[4771]: I0227 01:26:41.950199 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 27 01:26:41 crc kubenswrapper[4771]: I0227 01:26:41.961397 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 27 01:26:42 crc kubenswrapper[4771]: I0227 01:26:42.843778 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.616748 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.737511 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.819251 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-scripts\") pod \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.819406 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-run-httpd\") pod \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.819438 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-combined-ca-bundle\") pod \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.819500 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-log-httpd\") pod \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.819630 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj2w4\" (UniqueName: \"kubernetes.io/projected/2f082e8d-a1dc-43e2-9c41-7292439e0f88-kube-api-access-wj2w4\") pod \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.819660 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-config-data\") pod \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.819686 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-sg-core-conf-yaml\") pod \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\" (UID: \"2f082e8d-a1dc-43e2-9c41-7292439e0f88\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.820490 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2f082e8d-a1dc-43e2-9c41-7292439e0f88" (UID: "2f082e8d-a1dc-43e2-9c41-7292439e0f88"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.820608 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2f082e8d-a1dc-43e2-9c41-7292439e0f88" (UID: "2f082e8d-a1dc-43e2-9c41-7292439e0f88"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.826399 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-scripts" (OuterVolumeSpecName: "scripts") pod "2f082e8d-a1dc-43e2-9c41-7292439e0f88" (UID: "2f082e8d-a1dc-43e2-9c41-7292439e0f88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.826998 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f082e8d-a1dc-43e2-9c41-7292439e0f88-kube-api-access-wj2w4" (OuterVolumeSpecName: "kube-api-access-wj2w4") pod "2f082e8d-a1dc-43e2-9c41-7292439e0f88" (UID: "2f082e8d-a1dc-43e2-9c41-7292439e0f88"). InnerVolumeSpecName "kube-api-access-wj2w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.852776 4771 generic.go:334] "Generic (PLEG): container finished" podID="093bc468-6ace-4a5a-a695-b8b202f64bcd" containerID="00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945" exitCode=137 Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.852824 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"093bc468-6ace-4a5a-a695-b8b202f64bcd","Type":"ContainerDied","Data":"00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945"} Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.852889 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"093bc468-6ace-4a5a-a695-b8b202f64bcd","Type":"ContainerDied","Data":"a83e9360fec23743635a00a9181ac1936db3d189a396554dededf2f27679dcf5"} Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.852913 4771 scope.go:117] "RemoveContainer" containerID="00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.853174 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.865738 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2f082e8d-a1dc-43e2-9c41-7292439e0f88" (UID: "2f082e8d-a1dc-43e2-9c41-7292439e0f88"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.866979 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerID="86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941" exitCode=0 Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.867095 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerDied","Data":"86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941"} Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.867136 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f082e8d-a1dc-43e2-9c41-7292439e0f88","Type":"ContainerDied","Data":"6b1f52ba62e5666b35c6ff737cc98350f973f58df302e00cdcf80e20f98de522"} Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.867185 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.885250 4771 scope.go:117] "RemoveContainer" containerID="00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945" Feb 27 01:26:43 crc kubenswrapper[4771]: E0227 01:26:43.885860 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945\": container with ID starting with 00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945 not found: ID does not exist" containerID="00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.885914 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945"} err="failed to get container status \"00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945\": rpc error: code = NotFound desc = could not find container \"00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945\": container with ID starting with 00cd740c4404e462b1bc90e9acada3291ba9e04cce9a66bf068036db1fee0945 not found: ID does not exist" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.885937 4771 scope.go:117] "RemoveContainer" containerID="59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.914626 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f082e8d-a1dc-43e2-9c41-7292439e0f88" (UID: "2f082e8d-a1dc-43e2-9c41-7292439e0f88"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.921334 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data\") pod \"093bc468-6ace-4a5a-a695-b8b202f64bcd\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.921670 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vxf2\" (UniqueName: \"kubernetes.io/projected/093bc468-6ace-4a5a-a695-b8b202f64bcd-kube-api-access-6vxf2\") pod \"093bc468-6ace-4a5a-a695-b8b202f64bcd\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.921746 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-combined-ca-bundle\") pod \"093bc468-6ace-4a5a-a695-b8b202f64bcd\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.922260 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.922278 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.922290 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f082e8d-a1dc-43e2-9c41-7292439e0f88-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.922299 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj2w4\" (UniqueName: \"kubernetes.io/projected/2f082e8d-a1dc-43e2-9c41-7292439e0f88-kube-api-access-wj2w4\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.922308 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.922316 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.923310 4771 scope.go:117] "RemoveContainer" containerID="066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.927732 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093bc468-6ace-4a5a-a695-b8b202f64bcd-kube-api-access-6vxf2" (OuterVolumeSpecName: "kube-api-access-6vxf2") pod "093bc468-6ace-4a5a-a695-b8b202f64bcd" (UID: "093bc468-6ace-4a5a-a695-b8b202f64bcd"). InnerVolumeSpecName "kube-api-access-6vxf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.941521 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-config-data" (OuterVolumeSpecName: "config-data") pod "2f082e8d-a1dc-43e2-9c41-7292439e0f88" (UID: "2f082e8d-a1dc-43e2-9c41-7292439e0f88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: E0227 01:26:43.947093 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data podName:093bc468-6ace-4a5a-a695-b8b202f64bcd nodeName:}" failed. No retries permitted until 2026-02-27 01:26:44.447059727 +0000 UTC m=+1317.384621005 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data") pod "093bc468-6ace-4a5a-a695-b8b202f64bcd" (UID: "093bc468-6ace-4a5a-a695-b8b202f64bcd") : error deleting /var/lib/kubelet/pods/093bc468-6ace-4a5a-a695-b8b202f64bcd/volume-subpaths: remove /var/lib/kubelet/pods/093bc468-6ace-4a5a-a695-b8b202f64bcd/volume-subpaths: no such file or directory Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.949671 4771 scope.go:117] "RemoveContainer" containerID="86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.949859 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093bc468-6ace-4a5a-a695-b8b202f64bcd" (UID: "093bc468-6ace-4a5a-a695-b8b202f64bcd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:43 crc kubenswrapper[4771]: I0227 01:26:43.989404 4771 scope.go:117] "RemoveContainer" containerID="421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.024897 4771 scope.go:117] "RemoveContainer" containerID="59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201" Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.025394 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201\": container with ID starting with 59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201 not found: ID does not exist" containerID="59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.025427 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201"} err="failed to get container status \"59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201\": rpc error: code = NotFound desc = could not find container \"59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201\": container with ID starting with 59ef4a3642384b996c829237fb903f1d93239560603462909590268a1dacf201 not found: ID does not exist" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.025447 4771 scope.go:117] "RemoveContainer" containerID="066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b" Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.025784 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b\": container with ID starting with 066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b not found: ID does not exist" containerID="066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.025838 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b"} err="failed to get container status \"066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b\": rpc error: code = NotFound desc = could not find container \"066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b\": container with ID starting with 066a0b6b4d75bced7db6789f537a8751d33dcd9619cd4611dc307029b5332d5b not found: ID does not exist" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.025866 4771 scope.go:117] "RemoveContainer" containerID="86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941" Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.026098 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941\": container with ID starting with 86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941 not found: ID does not exist" containerID="86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.026133 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941"} err="failed to get container status \"86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941\": rpc error: code = NotFound desc = could not find container \"86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941\": container with ID starting with 86667540c8643b9730217724f4fb285e78d2795af4c5bef6c7338c4e58fde941 not found: ID does not exist" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.026147 4771 scope.go:117] "RemoveContainer" containerID="421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.026288 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.026332 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f082e8d-a1dc-43e2-9c41-7292439e0f88-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.026351 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vxf2\" (UniqueName: \"kubernetes.io/projected/093bc468-6ace-4a5a-a695-b8b202f64bcd-kube-api-access-6vxf2\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.026385 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e\": container with ID starting with 421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e not found: ID does not exist" containerID="421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.026434 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e"} err="failed to get container status \"421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e\": rpc error: code = NotFound desc = could not find container \"421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e\": container with ID starting with 421067aec0baefa92d900b90c6183d82a646a6960ef02dc705de4958f67e734e not found: ID does not exist" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.201430 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.210610 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.229218 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.229775 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093bc468-6ace-4a5a-a695-b8b202f64bcd" containerName="nova-cell1-novncproxy-novncproxy" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.229865 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="093bc468-6ace-4a5a-a695-b8b202f64bcd" containerName="nova-cell1-novncproxy-novncproxy" Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.229918 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="proxy-httpd" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.229962 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="proxy-httpd" Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.231086 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="sg-core" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.231187 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="sg-core" Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.231250 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="ceilometer-notification-agent" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.231309 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="ceilometer-notification-agent" Feb 27 01:26:44 crc kubenswrapper[4771]: E0227 01:26:44.231361 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="ceilometer-central-agent" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.231414 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="ceilometer-central-agent" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.231661 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="ceilometer-notification-agent" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.231738 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="ceilometer-central-agent" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.231786 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="sg-core" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.231847 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="093bc468-6ace-4a5a-a695-b8b202f64bcd" containerName="nova-cell1-novncproxy-novncproxy" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.231904 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" containerName="proxy-httpd" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.233574 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.240074 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.240336 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.240592 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.275506 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.332340 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.332383 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.332456 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-config-data\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.332513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-run-httpd\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.332581 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-scripts\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.332683 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-log-httpd\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.332740 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzhgj\" (UniqueName: \"kubernetes.io/projected/cdd2c054-ca14-46ab-9916-8df49d806879-kube-api-access-xzhgj\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.333209 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.435482 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.435704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.435749 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.435803 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-config-data\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.435846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-run-httpd\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.435885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-log-httpd\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.435922 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-scripts\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.435961 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzhgj\" (UniqueName: \"kubernetes.io/projected/cdd2c054-ca14-46ab-9916-8df49d806879-kube-api-access-xzhgj\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.436412 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-run-httpd\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.436420 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-log-httpd\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.440456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-scripts\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.441110 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.445969 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.446845 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.453765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-config-data\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.453985 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzhgj\" (UniqueName: \"kubernetes.io/projected/cdd2c054-ca14-46ab-9916-8df49d806879-kube-api-access-xzhgj\") pod \"ceilometer-0\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.546008 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data\") pod \"093bc468-6ace-4a5a-a695-b8b202f64bcd\" (UID: \"093bc468-6ace-4a5a-a695-b8b202f64bcd\") " Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.551447 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data" (OuterVolumeSpecName: "config-data") pod "093bc468-6ace-4a5a-a695-b8b202f64bcd" (UID: "093bc468-6ace-4a5a-a695-b8b202f64bcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.612746 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.647902 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093bc468-6ace-4a5a-a695-b8b202f64bcd-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.817154 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.828988 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.841766 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.843051 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.846999 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.847108 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.849829 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.850494 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.953641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmr2\" (UniqueName: \"kubernetes.io/projected/f748ee94-8cc7-4616-a035-a35770442cbc-kube-api-access-fdmr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.953702 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.953845 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.953876 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:44 crc kubenswrapper[4771]: I0227 01:26:44.954219 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.055989 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmr2\" (UniqueName: \"kubernetes.io/projected/f748ee94-8cc7-4616-a035-a35770442cbc-kube-api-access-fdmr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.056047 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.056130 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.056159 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.056255 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.062294 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.062337 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.062310 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.064954 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f748ee94-8cc7-4616-a035-a35770442cbc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.073426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmr2\" (UniqueName: \"kubernetes.io/projected/f748ee94-8cc7-4616-a035-a35770442cbc-kube-api-access-fdmr2\") pod \"nova-cell1-novncproxy-0\" (UID: \"f748ee94-8cc7-4616-a035-a35770442cbc\") " pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.156064 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.160792 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.649599 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 27 01:26:45 crc kubenswrapper[4771]: W0227 01:26:45.664883 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf748ee94_8cc7_4616_a035_a35770442cbc.slice/crio-0e36558598d4241bac1ddefe84018fd427bd86c771f088f9d7111672710e1072 WatchSource:0}: Error finding container 0e36558598d4241bac1ddefe84018fd427bd86c771f088f9d7111672710e1072: Status 404 returned error can't find the container with id 0e36558598d4241bac1ddefe84018fd427bd86c771f088f9d7111672710e1072 Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.794400 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093bc468-6ace-4a5a-a695-b8b202f64bcd" path="/var/lib/kubelet/pods/093bc468-6ace-4a5a-a695-b8b202f64bcd/volumes" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.796096 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f082e8d-a1dc-43e2-9c41-7292439e0f88" path="/var/lib/kubelet/pods/2f082e8d-a1dc-43e2-9c41-7292439e0f88/volumes" Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.898543 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f748ee94-8cc7-4616-a035-a35770442cbc","Type":"ContainerStarted","Data":"0e36558598d4241bac1ddefe84018fd427bd86c771f088f9d7111672710e1072"} Feb 27 01:26:45 crc kubenswrapper[4771]: I0227 01:26:45.900865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerStarted","Data":"77208d23ba05e7130ae77725e71610af52c678553fb025faeae8c6fca33d36e6"} Feb 27 01:26:46 crc kubenswrapper[4771]: I0227 01:26:46.922761 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f748ee94-8cc7-4616-a035-a35770442cbc","Type":"ContainerStarted","Data":"57d80686925551d72caafc0c17e2808770747ee42a21d462220d12c72489d86a"} Feb 27 01:26:46 crc kubenswrapper[4771]: I0227 01:26:46.926981 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerStarted","Data":"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c"} Feb 27 01:26:46 crc kubenswrapper[4771]: I0227 01:26:46.927026 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerStarted","Data":"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04"} Feb 27 01:26:46 crc kubenswrapper[4771]: 
I0227 01:26:46.949437 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.949415513 podStartE2EDuration="2.949415513s" podCreationTimestamp="2026-02-27 01:26:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:46.944402198 +0000 UTC m=+1319.881963556" watchObservedRunningTime="2026-02-27 01:26:46.949415513 +0000 UTC m=+1319.886976801" Feb 27 01:26:47 crc kubenswrapper[4771]: I0227 01:26:47.011012 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 01:26:47 crc kubenswrapper[4771]: I0227 01:26:47.011585 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 01:26:47 crc kubenswrapper[4771]: I0227 01:26:47.012293 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 01:26:47 crc kubenswrapper[4771]: I0227 01:26:47.016129 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 01:26:47 crc kubenswrapper[4771]: I0227 01:26:47.939078 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerStarted","Data":"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c"} Feb 27 01:26:47 crc kubenswrapper[4771]: I0227 01:26:47.940353 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 01:26:47 crc kubenswrapper[4771]: I0227 01:26:47.943673 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.135908 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-jc8lh"] Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.148146 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.167114 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-jc8lh"] Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.334618 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.334820 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.334858 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.334921 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.334973 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h97cn\" (UniqueName: \"kubernetes.io/projected/7e507450-eb79-43bc-ae7b-89352c222a44-kube-api-access-h97cn\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.335008 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-config\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.436719 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.436907 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.436947 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.436982 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.437005 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h97cn\" (UniqueName: \"kubernetes.io/projected/7e507450-eb79-43bc-ae7b-89352c222a44-kube-api-access-h97cn\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.437041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-config\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.437984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-config\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.438620 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.439232 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.439941 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.440926 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.470494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h97cn\" (UniqueName: 
\"kubernetes.io/projected/7e507450-eb79-43bc-ae7b-89352c222a44-kube-api-access-h97cn\") pod \"dnsmasq-dns-89c5cd4d5-jc8lh\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.494664 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:48 crc kubenswrapper[4771]: I0227 01:26:48.971871 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-jc8lh"] Feb 27 01:26:49 crc kubenswrapper[4771]: I0227 01:26:49.179229 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 27 01:26:49 crc kubenswrapper[4771]: I0227 01:26:49.960741 4771 generic.go:334] "Generic (PLEG): container finished" podID="7e507450-eb79-43bc-ae7b-89352c222a44" containerID="b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2" exitCode=0 Feb 27 01:26:49 crc kubenswrapper[4771]: I0227 01:26:49.960809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" event={"ID":"7e507450-eb79-43bc-ae7b-89352c222a44","Type":"ContainerDied","Data":"b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2"} Feb 27 01:26:49 crc kubenswrapper[4771]: I0227 01:26:49.961102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" event={"ID":"7e507450-eb79-43bc-ae7b-89352c222a44","Type":"ContainerStarted","Data":"0b28ee3e6bbde0f2462b65ab797aeae0b40bb827edfe5c4e7f638877f52e26f6"} Feb 27 01:26:49 crc kubenswrapper[4771]: I0227 01:26:49.965485 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerStarted","Data":"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c"} Feb 27 01:26:49 crc kubenswrapper[4771]: I0227 01:26:49.965768 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 27 01:26:50 crc kubenswrapper[4771]: I0227 01:26:50.008540 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.075230379 podStartE2EDuration="6.008521881s" podCreationTimestamp="2026-02-27 01:26:44 +0000 UTC" firstStartedPulling="2026-02-27 01:26:45.158633162 +0000 UTC m=+1318.096194460" lastFinishedPulling="2026-02-27 01:26:49.091924674 +0000 UTC m=+1322.029485962" observedRunningTime="2026-02-27 01:26:50.005004835 +0000 UTC m=+1322.942566153" watchObservedRunningTime="2026-02-27 01:26:50.008521881 +0000 UTC m=+1322.946083169" Feb 27 01:26:50 crc kubenswrapper[4771]: I0227 01:26:50.161626 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 27 01:26:50 crc kubenswrapper[4771]: I0227 01:26:50.726907 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 01:26:50 crc kubenswrapper[4771]: I0227 01:26:50.969259 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:50 crc kubenswrapper[4771]: I0227 01:26:50.981525 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-log" containerID="cri-o://3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839" gracePeriod=30 Feb 27 01:26:50 crc kubenswrapper[4771]: I0227 
01:26:50.981977 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-api" containerID="cri-o://ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95" gracePeriod=30 Feb 27 01:26:50 crc kubenswrapper[4771]: I0227 01:26:50.982257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" event={"ID":"7e507450-eb79-43bc-ae7b-89352c222a44","Type":"ContainerStarted","Data":"47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf"} Feb 27 01:26:50 crc kubenswrapper[4771]: I0227 01:26:50.983138 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:51 crc kubenswrapper[4771]: I0227 01:26:51.017199 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" podStartSLOduration=3.017175507 podStartE2EDuration="3.017175507s" podCreationTimestamp="2026-02-27 01:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:51.01435735 +0000 UTC m=+1323.951918638" watchObservedRunningTime="2026-02-27 01:26:51.017175507 +0000 UTC m=+1323.954736805" Feb 27 01:26:51 crc kubenswrapper[4771]: I0227 01:26:51.997759 4771 generic.go:334] "Generic (PLEG): container finished" podID="0f3aca07-3840-4b9d-85f8-86599214f911" containerID="3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839" exitCode=143 Feb 27 01:26:51 crc kubenswrapper[4771]: I0227 01:26:51.998624 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f3aca07-3840-4b9d-85f8-86599214f911","Type":"ContainerDied","Data":"3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839"} Feb 27 01:26:51 crc kubenswrapper[4771]: I0227 01:26:51.998799 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="ceilometer-central-agent" containerID="cri-o://5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04" gracePeriod=30 Feb 27 01:26:51 crc kubenswrapper[4771]: I0227 01:26:51.999164 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="proxy-httpd" containerID="cri-o://542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c" gracePeriod=30 Feb 27 01:26:51 crc kubenswrapper[4771]: I0227 01:26:51.999248 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="ceilometer-notification-agent" containerID="cri-o://0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c" gracePeriod=30 Feb 27 01:26:51 crc kubenswrapper[4771]: I0227 01:26:51.999290 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="sg-core" containerID="cri-o://83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c" gracePeriod=30 Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.829290 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.929237 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-config-data\") pod \"cdd2c054-ca14-46ab-9916-8df49d806879\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.929309 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-combined-ca-bundle\") pod \"cdd2c054-ca14-46ab-9916-8df49d806879\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.929349 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-scripts\") pod \"cdd2c054-ca14-46ab-9916-8df49d806879\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.929417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-ceilometer-tls-certs\") pod \"cdd2c054-ca14-46ab-9916-8df49d806879\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.929441 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzhgj\" (UniqueName: \"kubernetes.io/projected/cdd2c054-ca14-46ab-9916-8df49d806879-kube-api-access-xzhgj\") pod \"cdd2c054-ca14-46ab-9916-8df49d806879\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.929524 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-sg-core-conf-yaml\") pod \"cdd2c054-ca14-46ab-9916-8df49d806879\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.929586 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-run-httpd\") pod \"cdd2c054-ca14-46ab-9916-8df49d806879\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.929696 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-log-httpd\") pod \"cdd2c054-ca14-46ab-9916-8df49d806879\" (UID: \"cdd2c054-ca14-46ab-9916-8df49d806879\") " Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.930242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cdd2c054-ca14-46ab-9916-8df49d806879" (UID: "cdd2c054-ca14-46ab-9916-8df49d806879"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.930468 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cdd2c054-ca14-46ab-9916-8df49d806879" (UID: "cdd2c054-ca14-46ab-9916-8df49d806879"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.934865 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd2c054-ca14-46ab-9916-8df49d806879-kube-api-access-xzhgj" (OuterVolumeSpecName: "kube-api-access-xzhgj") pod "cdd2c054-ca14-46ab-9916-8df49d806879" (UID: "cdd2c054-ca14-46ab-9916-8df49d806879"). InnerVolumeSpecName "kube-api-access-xzhgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.938686 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-scripts" (OuterVolumeSpecName: "scripts") pod "cdd2c054-ca14-46ab-9916-8df49d806879" (UID: "cdd2c054-ca14-46ab-9916-8df49d806879"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.965527 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cdd2c054-ca14-46ab-9916-8df49d806879" (UID: "cdd2c054-ca14-46ab-9916-8df49d806879"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:52 crc kubenswrapper[4771]: I0227 01:26:52.992495 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cdd2c054-ca14-46ab-9916-8df49d806879" (UID: "cdd2c054-ca14-46ab-9916-8df49d806879"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.004301 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdd2c054-ca14-46ab-9916-8df49d806879" (UID: "cdd2c054-ca14-46ab-9916-8df49d806879"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013270 4771 generic.go:334] "Generic (PLEG): container finished" podID="cdd2c054-ca14-46ab-9916-8df49d806879" containerID="542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c" exitCode=0 Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013299 4771 generic.go:334] "Generic (PLEG): container finished" podID="cdd2c054-ca14-46ab-9916-8df49d806879" containerID="83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c" exitCode=2 Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013308 4771 generic.go:334] "Generic (PLEG): container finished" podID="cdd2c054-ca14-46ab-9916-8df49d806879" containerID="0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c" exitCode=0 Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013316 4771 generic.go:334] "Generic (PLEG): container finished" podID="cdd2c054-ca14-46ab-9916-8df49d806879" containerID="5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04" exitCode=0 Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013374 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013412 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerDied","Data":"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c"} Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerDied","Data":"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c"} Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013495 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerDied","Data":"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c"} Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013507 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerDied","Data":"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04"} Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013521 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdd2c054-ca14-46ab-9916-8df49d806879","Type":"ContainerDied","Data":"77208d23ba05e7130ae77725e71610af52c678553fb025faeae8c6fca33d36e6"} Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.013540 4771 scope.go:117] "RemoveContainer" containerID="542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.032465 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.032507 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.032518 4771 reconciler_common.go:293] "Volume detached for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdd2c054-ca14-46ab-9916-8df49d806879-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.032527 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.032536 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-scripts\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.032581 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.032593 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzhgj\" (UniqueName: \"kubernetes.io/projected/cdd2c054-ca14-46ab-9916-8df49d806879-kube-api-access-xzhgj\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.033463 4771 scope.go:117] "RemoveContainer" containerID="83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.054654 4771 scope.go:117] "RemoveContainer" containerID="0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.060208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-config-data" (OuterVolumeSpecName: "config-data") pod "cdd2c054-ca14-46ab-9916-8df49d806879" (UID: "cdd2c054-ca14-46ab-9916-8df49d806879"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.073375 4771 scope.go:117] "RemoveContainer" containerID="5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.090229 4771 scope.go:117] "RemoveContainer" containerID="542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c" Feb 27 01:26:53 crc kubenswrapper[4771]: E0227 01:26:53.090916 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": container with ID starting with 542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c not found: ID does not exist" containerID="542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.090983 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c"} err="failed to get container status \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": rpc error: code = NotFound desc = could not find container \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": container with ID starting with 542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.091030 4771 scope.go:117] "RemoveContainer" containerID="83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c" Feb 27 01:26:53 crc kubenswrapper[4771]: E0227 01:26:53.091619 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": container with ID starting with 83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c not found: ID does not exist" containerID="83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.091650 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c"} err="failed to get container status \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": rpc error: code = NotFound desc = could not find container \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": container with ID starting with 83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.091670 4771 scope.go:117] "RemoveContainer" containerID="0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c" Feb 27 01:26:53 crc kubenswrapper[4771]: E0227 01:26:53.092078 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": container with ID starting with 0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c not found: ID does not exist" containerID="0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.092131 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c"} err="failed to get container status \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": rpc error: code = NotFound desc = could not find container \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": container with ID starting with 0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.092162 4771 scope.go:117] "RemoveContainer" containerID="5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04" Feb 27 01:26:53 crc kubenswrapper[4771]: E0227 01:26:53.092452 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": container with ID starting with 5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04 not found: ID does not exist" containerID="5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.092482 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04"} err="failed to get container status \"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": rpc error: code = NotFound desc = could not find container \"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": container with ID starting with 5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04 not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.092500 4771 scope.go:117] "RemoveContainer" containerID="542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.092852 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c"} err="failed to get container status \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": rpc error: code = NotFound desc = could not find container \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": container with ID starting with 542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.092876 4771 scope.go:117] "RemoveContainer" containerID="83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.093135 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c"} err="failed to get container status \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": rpc error: code = NotFound desc = could not find container \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": container with ID starting with 83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.093157 4771 scope.go:117] "RemoveContainer" containerID="0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.093410 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c"} err="failed to get container status \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": rpc error: code = NotFound desc = could not find container \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": container with ID starting with 0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.093439 4771 scope.go:117] "RemoveContainer" containerID="5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.093714 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04"} err="failed to get container status \"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": rpc error: code = NotFound desc = could not find container \"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": container with ID starting with 5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04 not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.093739 4771 scope.go:117] "RemoveContainer" containerID="542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.094029 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c"} err="failed to get container status \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": rpc error: code = NotFound desc = could not find container \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": container with ID starting with 542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.094074 4771 scope.go:117] "RemoveContainer" containerID="83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.094383 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c"} err="failed to get container status \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": rpc error: code = NotFound desc = could not find container \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": container with ID starting with 83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.094429 4771 scope.go:117] "RemoveContainer" containerID="0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.094760 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c"} err="failed to get container status \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": rpc error: code = NotFound desc = could not find container \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": container with ID starting with 0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c not found: ID does not exist" Feb 
27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.094785 4771 scope.go:117] "RemoveContainer" containerID="5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.095013 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04"} err="failed to get container status \"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": rpc error: code = NotFound desc = could not find container \"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": container with ID starting with 5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04 not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.095053 4771 scope.go:117] "RemoveContainer" containerID="542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.095308 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c"} err="failed to get container status \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": rpc error: code = NotFound desc = could not find container \"542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c\": container with ID starting with 542f70b231b8f1c142614b64a655f4e36003eb03963365227b4aa6b79c9ee45c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.095333 4771 scope.go:117] "RemoveContainer" containerID="83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.095607 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c"} err="failed to get container status \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": rpc error: code = NotFound desc = could not find container \"83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c\": container with ID starting with 83dbe9f0fdd23e16ff381095d8a3df6c0e3f482ed87027df7bb0822e1d9af39c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.095654 4771 scope.go:117] "RemoveContainer" containerID="0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.095971 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c"} err="failed to get container status \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": rpc error: code = NotFound desc = could not find container \"0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c\": container with ID starting with 0bf55372ece06e7630239a477e3ac4aa7769686d684182eeedd69db0e0fb3b0c not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.096000 4771 scope.go:117] "RemoveContainer" containerID="5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.096255 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04"} err="failed to get container status 
\"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": rpc error: code = NotFound desc = could not find container \"5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04\": container with ID starting with 5d5f77d0921a00561c63844a8d008f08f3e94a83eb93d16d1dc693621b999e04 not found: ID does not exist" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.133882 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd2c054-ca14-46ab-9916-8df49d806879-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.380950 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.396870 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.411169 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 27 01:26:53 crc kubenswrapper[4771]: E0227 01:26:53.411784 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="ceilometer-central-agent" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.411814 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="ceilometer-central-agent" Feb 27 01:26:53 crc kubenswrapper[4771]: E0227 01:26:53.411859 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="ceilometer-notification-agent" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.411873 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="ceilometer-notification-agent" Feb 27 01:26:53 crc kubenswrapper[4771]: E0227 01:26:53.411902 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="sg-core" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.411914 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="sg-core" Feb 27 01:26:53 crc kubenswrapper[4771]: E0227 01:26:53.411940 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="proxy-httpd" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.411952 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="proxy-httpd" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.412277 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="proxy-httpd" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.412329 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="ceilometer-central-agent" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.412348 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="ceilometer-notification-agent" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.412375 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" containerName="sg-core" Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.415324 4771 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.421134 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.421612 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.421671 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.423175 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.545330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26685008-55b9-4176-98b8-f915a6004b36-run-httpd\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.545389 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.545436 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-config-data\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.545523 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26685008-55b9-4176-98b8-f915a6004b36-log-httpd\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.545684 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.545976 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-scripts\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.546118 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.546177 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh5zh\" (UniqueName: \"kubernetes.io/projected/26685008-55b9-4176-98b8-f915a6004b36-kube-api-access-lh5zh\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.647639 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-scripts\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.647708 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.647731 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh5zh\" (UniqueName: \"kubernetes.io/projected/26685008-55b9-4176-98b8-f915a6004b36-kube-api-access-lh5zh\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.647825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26685008-55b9-4176-98b8-f915a6004b36-run-httpd\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.647842 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.647860 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-config-data\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.647877 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26685008-55b9-4176-98b8-f915a6004b36-log-httpd\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.647897 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.648657 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26685008-55b9-4176-98b8-f915a6004b36-log-httpd\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.648693 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26685008-55b9-4176-98b8-f915a6004b36-run-httpd\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.652019 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.652124 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.653308 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-config-data\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.659649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.662145 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26685008-55b9-4176-98b8-f915a6004b36-scripts\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.667887 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh5zh\" (UniqueName: \"kubernetes.io/projected/26685008-55b9-4176-98b8-f915a6004b36-kube-api-access-lh5zh\") pod \"ceilometer-0\" (UID: \"26685008-55b9-4176-98b8-f915a6004b36\") " pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.737952 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 27 01:26:53 crc kubenswrapper[4771]: I0227 01:26:53.785053 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdd2c054-ca14-46ab-9916-8df49d806879" path="/var/lib/kubelet/pods/cdd2c054-ca14-46ab-9916-8df49d806879/volumes"
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.206996 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 27 01:26:54 crc kubenswrapper[4771]: W0227 01:26:54.229075 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26685008_55b9_4176_98b8_f915a6004b36.slice/crio-c9ba376d8f8c6753dd4f0308f5f20fa95a95e6e1d50f8c9dc9b5a051693da2d6 WatchSource:0}: Error finding container c9ba376d8f8c6753dd4f0308f5f20fa95a95e6e1d50f8c9dc9b5a051693da2d6: Status 404 returned error can't find the container with id c9ba376d8f8c6753dd4f0308f5f20fa95a95e6e1d50f8c9dc9b5a051693da2d6
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.687587 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.872791 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-combined-ca-bundle\") pod \"0f3aca07-3840-4b9d-85f8-86599214f911\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") "
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.872849 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4hhx\" (UniqueName: \"kubernetes.io/projected/0f3aca07-3840-4b9d-85f8-86599214f911-kube-api-access-l4hhx\") pod \"0f3aca07-3840-4b9d-85f8-86599214f911\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") "
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.872980 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3aca07-3840-4b9d-85f8-86599214f911-logs\") pod \"0f3aca07-3840-4b9d-85f8-86599214f911\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") "
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.873024 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-config-data\") pod \"0f3aca07-3840-4b9d-85f8-86599214f911\" (UID: \"0f3aca07-3840-4b9d-85f8-86599214f911\") "
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.874959 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3aca07-3840-4b9d-85f8-86599214f911-logs" (OuterVolumeSpecName: "logs") pod "0f3aca07-3840-4b9d-85f8-86599214f911" (UID: "0f3aca07-3840-4b9d-85f8-86599214f911"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.877107 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3aca07-3840-4b9d-85f8-86599214f911-logs\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.884437 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3aca07-3840-4b9d-85f8-86599214f911-kube-api-access-l4hhx" (OuterVolumeSpecName: "kube-api-access-l4hhx") pod "0f3aca07-3840-4b9d-85f8-86599214f911" (UID: "0f3aca07-3840-4b9d-85f8-86599214f911"). InnerVolumeSpecName "kube-api-access-l4hhx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.918864 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f3aca07-3840-4b9d-85f8-86599214f911" (UID: "0f3aca07-3840-4b9d-85f8-86599214f911"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.923364 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-config-data" (OuterVolumeSpecName: "config-data") pod "0f3aca07-3840-4b9d-85f8-86599214f911" (UID: "0f3aca07-3840-4b9d-85f8-86599214f911"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.979304 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.979347 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3aca07-3840-4b9d-85f8-86599214f911-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:54 crc kubenswrapper[4771]: I0227 01:26:54.979360 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4hhx\" (UniqueName: \"kubernetes.io/projected/0f3aca07-3840-4b9d-85f8-86599214f911-kube-api-access-l4hhx\") on node \"crc\" DevicePath \"\""
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.034706 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26685008-55b9-4176-98b8-f915a6004b36","Type":"ContainerStarted","Data":"fc30a27810addeb174ae7019e2d94060cfedd18bce01c7fa18629895565abdd6"}
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.034745 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26685008-55b9-4176-98b8-f915a6004b36","Type":"ContainerStarted","Data":"c9ba376d8f8c6753dd4f0308f5f20fa95a95e6e1d50f8c9dc9b5a051693da2d6"}
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.037475 4771 generic.go:334] "Generic (PLEG): container finished" podID="0f3aca07-3840-4b9d-85f8-86599214f911" containerID="ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95" exitCode=0
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.037512 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f3aca07-3840-4b9d-85f8-86599214f911","Type":"ContainerDied","Data":"ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95"}
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.037531 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0f3aca07-3840-4b9d-85f8-86599214f911","Type":"ContainerDied","Data":"068f84c6c4b7b7cbd149deee0b89c86c3a3933111854bfe840ea50f11d657d5c"}
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.037578 4771 scope.go:117] "RemoveContainer" containerID="ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.037960 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.062746 4771 scope.go:117] "RemoveContainer" containerID="3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.077330 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.085284 4771 scope.go:117] "RemoveContainer" containerID="ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95"
Feb 27 01:26:55 crc kubenswrapper[4771]: E0227 01:26:55.085793 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95\": container with ID starting with ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95 not found: ID does not exist" containerID="ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.085848 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95"} err="failed to get container status \"ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95\": rpc error: code = NotFound desc = could not find container \"ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95\": container with ID starting with ea3176dc132bfd634de96d7d6c3c3c5b8796732cc28b642bd241b0da88679c95 not found: ID does not exist"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.085886 4771 scope.go:117] "RemoveContainer" containerID="3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839"
Feb 27 01:26:55 crc kubenswrapper[4771]: E0227 01:26:55.086264 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839\": container with ID starting with 3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839 not found: ID does not exist" containerID="3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.086287 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839"} err="failed to get container status \"3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839\": rpc error: code = NotFound desc = could not find container \"3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839\": container with ID starting with 3d1195a4447640cb2139971edfb945a9f0e592bb9483c3518ec0899259141839 not found: ID does not exist"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.108019 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.128334 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:55 crc kubenswrapper[4771]: E0227 01:26:55.128863 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-log"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.128926 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-log"
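[Editor's note] The RemoveContainer / NotFound pairs above are a common benign race: the kubelet asks CRI-O for the status of a container it has just deleted. A rough filter to separate that noise from genuine runtime errors; the heuristic (NotFound is harmless when the same ID appeared in an earlier RemoveContainer entry) is an assumption, not a kubelet guarantee:

    import re

    REMOVE = re.compile(r'"RemoveContainer" containerID="(?P<cid>[0-9a-f]{64})"')
    # The NotFound message embeds the ID between literal \" escapes in this capture.
    NOTFOUND = re.compile(r'could not find container \\"(?P<cid>[0-9a-f]{64})\\"')

    def classify_notfound(lines):
        """Yield (short id, verdict) for each NotFound error in the stream."""
        removed = set()
        for line in lines:
            m = REMOVE.search(line)
            if m:
                removed.add(m.group('cid'))
                continue
            m = NOTFOUND.search(line)
            if m:
                cid = m.group('cid')
                verdict = ('benign: follows RemoveContainer' if cid in removed
                           else 'investigate: NotFound without prior delete')
                yield cid[:12], verdict

Both NotFound errors above (ea3176dc..., 3d1195a4...) classify as benign under this rule.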
Feb 27 01:26:55 crc kubenswrapper[4771]: E0227 01:26:55.128982 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-api"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.129055 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-api"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.129263 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-log"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.129321 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" containerName="nova-api-api"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.130267 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.133979 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.134489 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.134673 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.139810 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.161434 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.183503 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.286792 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.286844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-config-data\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.286872 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rxfk\" (UniqueName: \"kubernetes.io/projected/b95f2002-9af0-440b-956d-2734dde1a919-kube-api-access-9rxfk\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.287080 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.287127 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95f2002-9af0-440b-956d-2734dde1a919-logs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.287147 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-public-tls-certs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.388990 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.389092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95f2002-9af0-440b-956d-2734dde1a919-logs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.389128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-public-tls-certs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.389235 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.389273 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-config-data\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.389314 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rxfk\" (UniqueName: \"kubernetes.io/projected/b95f2002-9af0-440b-956d-2734dde1a919-kube-api-access-9rxfk\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.390860 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95f2002-9af0-440b-956d-2734dde1a919-logs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.393140 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-config-data\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.394616 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-public-tls-certs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.397369 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.398705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.406806 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rxfk\" (UniqueName: \"kubernetes.io/projected/b95f2002-9af0-440b-956d-2734dde1a919-kube-api-access-9rxfk\") pod \"nova-api-0\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.452430 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.787076 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3aca07-3840-4b9d-85f8-86599214f911" path="/var/lib/kubelet/pods/0f3aca07-3840-4b9d-85f8-86599214f911/volumes"
Feb 27 01:26:55 crc kubenswrapper[4771]: I0227 01:26:55.947247 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:26:55 crc kubenswrapper[4771]: W0227 01:26:55.949161 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb95f2002_9af0_440b_956d_2734dde1a919.slice/crio-321ada706e0a6907360f751ad517012d615e733d4e28797816aaadaf3eeec543 WatchSource:0}: Error finding container 321ada706e0a6907360f751ad517012d615e733d4e28797816aaadaf3eeec543: Status 404 returned error can't find the container with id 321ada706e0a6907360f751ad517012d615e733d4e28797816aaadaf3eeec543
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.047865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b95f2002-9af0-440b-956d-2734dde1a919","Type":"ContainerStarted","Data":"321ada706e0a6907360f751ad517012d615e733d4e28797816aaadaf3eeec543"}
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.052222 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26685008-55b9-4176-98b8-f915a6004b36","Type":"ContainerStarted","Data":"ecbc8dc5fa007f4f67950a5dbf85f40a34b565dfe2aec392055169e325961e21"}
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.078770 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.317102 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mzh27"]
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.322257 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.324536 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.324823 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.334665 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzh27"]
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.409317 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-config-data\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.409441 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-977ts\" (UniqueName: \"kubernetes.io/projected/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-kube-api-access-977ts\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.409510 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-scripts\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.409570 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.511350 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-config-data\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.511492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-977ts\" (UniqueName: \"kubernetes.io/projected/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-kube-api-access-977ts\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.511582 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-scripts\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.511632 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.515823 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-config-data\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.516290 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-scripts\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.529177 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-977ts\" (UniqueName: \"kubernetes.io/projected/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-kube-api-access-977ts\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.529813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mzh27\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") " pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:56 crc kubenswrapper[4771]: I0227 01:26:56.675515 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:26:57 crc kubenswrapper[4771]: I0227 01:26:57.065010 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b95f2002-9af0-440b-956d-2734dde1a919","Type":"ContainerStarted","Data":"67b43932ca2388709a10439977aacc1ede8a467713f717bc06b171fd409aa17d"}
Feb 27 01:26:57 crc kubenswrapper[4771]: I0227 01:26:57.065360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b95f2002-9af0-440b-956d-2734dde1a919","Type":"ContainerStarted","Data":"6d058c14f94f6075b3d3fc347b91c26d963448674cbe80dcd44a15eb3a8e7e26"}
Feb 27 01:26:57 crc kubenswrapper[4771]: I0227 01:26:57.071236 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26685008-55b9-4176-98b8-f915a6004b36","Type":"ContainerStarted","Data":"4a14db5511e363959d02b908ac771059e9694a5f316c163b7be52a0515c1dc50"}
Feb 27 01:26:57 crc kubenswrapper[4771]: I0227 01:26:57.092315 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.092295272 podStartE2EDuration="2.092295272s" podCreationTimestamp="2026-02-27 01:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:57.079904065 +0000 UTC m=+1330.017465353" watchObservedRunningTime="2026-02-27 01:26:57.092295272 +0000 UTC m=+1330.029856560"
Feb 27 01:26:57 crc kubenswrapper[4771]: I0227 01:26:57.179708 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzh27"]
Feb 27 01:26:57 crc kubenswrapper[4771]: W0227 01:26:57.180367 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0f3b6c0_daf5_40b9_bdd9_008890a2684a.slice/crio-142b070946535f18e157924e484be334b583fb16706efb9e88d5b697f118a3bd WatchSource:0}: Error finding container 142b070946535f18e157924e484be334b583fb16706efb9e88d5b697f118a3bd: Status 404 returned error can't find the container with id 142b070946535f18e157924e484be334b583fb16706efb9e88d5b697f118a3bd
Feb 27 01:26:58 crc kubenswrapper[4771]: I0227 01:26:58.085285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzh27" event={"ID":"f0f3b6c0-daf5-40b9-bdd9-008890a2684a","Type":"ContainerStarted","Data":"795b1623dbd9b500c659a38f5a9f2f23a7f05e928acadd25f8f99fccf7637d4f"}
Feb 27 01:26:58 crc kubenswrapper[4771]: I0227 01:26:58.085602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzh27" event={"ID":"f0f3b6c0-daf5-40b9-bdd9-008890a2684a","Type":"ContainerStarted","Data":"142b070946535f18e157924e484be334b583fb16706efb9e88d5b697f118a3bd"}
Feb 27 01:26:58 crc kubenswrapper[4771]: I0227 01:26:58.109470 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mzh27" podStartSLOduration=2.109449239 podStartE2EDuration="2.109449239s" podCreationTimestamp="2026-02-27 01:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:26:58.103734384 +0000 UTC m=+1331.041295712" watchObservedRunningTime="2026-02-27 01:26:58.109449239 +0000 UTC m=+1331.047010537"
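[Editor's note] The pod_startup_latency_tracker entries report the SLO and end-to-end startup durations directly, and every timestamp also carries a kubelet-monotonic offset (m=+seconds since process start), which is a safer basis for arithmetic across entries than the wall-clock strings. A small extractor, assuming this capture's key=value layout:

    import re

    SLO = re.compile(r'pod="(?P<pod>[^"]+)" podStartSLOduration=(?P<slo>[\d.]+)')
    MONO = re.compile(r'm=\+(?P<sec>[\d.]+)')

    def startup_durations(lines):
        for line in lines:
            m = SLO.search(line)
            if m:
                yield m.group('pod'), float(m.group('slo'))

    def elapsed_seconds(line_a, line_b):
        """Seconds between two entries via their (first) monotonic m=+ offsets."""
        return float(MONO.search(line_b).group('sec')) - float(MONO.search(line_a).group('sec'))

For openstack/nova-api-0 and the cell-mapping job above this gives about 2.09 s and 2.11 s; podStartE2EDuration equals podStartSLOduration here because no image pull happened (firstStartedPulling is the zero time).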
status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:26:58 crc kubenswrapper[4771]: I0227 01:26:58.604465 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-mr7mf"] Feb 27 01:26:58 crc kubenswrapper[4771]: I0227 01:26:58.604717 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" podUID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" containerName="dnsmasq-dns" containerID="cri-o://8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d" gracePeriod=10 Feb 27 01:26:58 crc kubenswrapper[4771]: I0227 01:26:58.954165 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:26:58 crc kubenswrapper[4771]: I0227 01:26:58.954451 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.093792 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.094800 4771 generic.go:334] "Generic (PLEG): container finished" podID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" containerID="8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d" exitCode=0 Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.094829 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" event={"ID":"bc9923bd-11c1-4d1d-965b-17e8352ece8c","Type":"ContainerDied","Data":"8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d"} Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.094860 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf" event={"ID":"bc9923bd-11c1-4d1d-965b-17e8352ece8c","Type":"ContainerDied","Data":"fcb1ce90d61040b27a02a35ac5a12c3b9e8ac6802b97c7f6a109a99577c307b9"} Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.094893 4771 scope.go:117] "RemoveContainer" containerID="8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.097871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"26685008-55b9-4176-98b8-f915a6004b36","Type":"ContainerStarted","Data":"1b77e5d17d405a8c38075258c600d3de2ad39f99faeadcb0cc29042a96ee6b48"} Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.121993 4771 scope.go:117] "RemoveContainer" containerID="c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.141325 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.127303451 podStartE2EDuration="6.141308665s" podCreationTimestamp="2026-02-27 01:26:53 +0000 UTC" firstStartedPulling="2026-02-27 01:26:54.233855313 +0000 UTC m=+1327.171416601" lastFinishedPulling="2026-02-27 01:26:58.247860537 +0000 UTC m=+1331.185421815" observedRunningTime="2026-02-27 01:26:59.131849748 
+0000 UTC m=+1332.069411036" watchObservedRunningTime="2026-02-27 01:26:59.141308665 +0000 UTC m=+1332.078869953" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.180140 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-config\") pod \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.180187 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-sb\") pod \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.180323 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/bc9923bd-11c1-4d1d-965b-17e8352ece8c-kube-api-access-t64wv\") pod \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.180417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-nb\") pod \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.180447 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-svc\") pod \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.180604 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-swift-storage-0\") pod \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\" (UID: \"bc9923bd-11c1-4d1d-965b-17e8352ece8c\") " Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.189057 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9923bd-11c1-4d1d-965b-17e8352ece8c-kube-api-access-t64wv" (OuterVolumeSpecName: "kube-api-access-t64wv") pod "bc9923bd-11c1-4d1d-965b-17e8352ece8c" (UID: "bc9923bd-11c1-4d1d-965b-17e8352ece8c"). InnerVolumeSpecName "kube-api-access-t64wv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.195643 4771 scope.go:117] "RemoveContainer" containerID="8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d" Feb 27 01:26:59 crc kubenswrapper[4771]: E0227 01:26:59.197156 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d\": container with ID starting with 8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d not found: ID does not exist" containerID="8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.197204 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d"} err="failed to get container status \"8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d\": rpc error: code = NotFound desc = could not find container \"8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d\": container with ID starting with 8ea850a3026c7791dfec2596a9f06c5992bde67d230a4b9e7ae6bb75c4c9544d not found: ID does not exist" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.197228 4771 scope.go:117] "RemoveContainer" containerID="c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06" Feb 27 01:26:59 crc kubenswrapper[4771]: E0227 01:26:59.197723 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06\": container with ID starting with c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06 not found: ID does not exist" containerID="c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.197739 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06"} err="failed to get container status \"c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06\": rpc error: code = NotFound desc = could not find container \"c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06\": container with ID starting with c68767e3698a273f9632698e9f3c9ae6e8c33000b9ea252b4be09ef6cfffdc06 not found: ID does not exist" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.247676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc9923bd-11c1-4d1d-965b-17e8352ece8c" (UID: "bc9923bd-11c1-4d1d-965b-17e8352ece8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.251872 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc9923bd-11c1-4d1d-965b-17e8352ece8c" (UID: "bc9923bd-11c1-4d1d-965b-17e8352ece8c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.254060 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc9923bd-11c1-4d1d-965b-17e8352ece8c" (UID: "bc9923bd-11c1-4d1d-965b-17e8352ece8c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.262889 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-config" (OuterVolumeSpecName: "config") pod "bc9923bd-11c1-4d1d-965b-17e8352ece8c" (UID: "bc9923bd-11c1-4d1d-965b-17e8352ece8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.274362 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bc9923bd-11c1-4d1d-965b-17e8352ece8c" (UID: "bc9923bd-11c1-4d1d-965b-17e8352ece8c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.282319 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.282349 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.282359 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.282371 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.282380 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9923bd-11c1-4d1d-965b-17e8352ece8c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:26:59 crc kubenswrapper[4771]: I0227 01:26:59.282390 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t64wv\" (UniqueName: \"kubernetes.io/projected/bc9923bd-11c1-4d1d-965b-17e8352ece8c-kube-api-access-t64wv\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:00 crc kubenswrapper[4771]: I0227 01:27:00.108879 4771 util.go:48] "No ready sandbox for pod can be found. 
Feb 27 01:27:00 crc kubenswrapper[4771]: I0227 01:27:00.108879 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-mr7mf"
Feb 27 01:27:00 crc kubenswrapper[4771]: I0227 01:27:00.109248 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 27 01:27:00 crc kubenswrapper[4771]: I0227 01:27:00.144807 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-mr7mf"]
Feb 27 01:27:00 crc kubenswrapper[4771]: I0227 01:27:00.154964 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-mr7mf"]
Feb 27 01:27:01 crc kubenswrapper[4771]: I0227 01:27:01.792929 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" path="/var/lib/kubelet/pods/bc9923bd-11c1-4d1d-965b-17e8352ece8c/volumes"
Feb 27 01:27:02 crc kubenswrapper[4771]: I0227 01:27:02.131245 4771 generic.go:334] "Generic (PLEG): container finished" podID="f0f3b6c0-daf5-40b9-bdd9-008890a2684a" containerID="795b1623dbd9b500c659a38f5a9f2f23a7f05e928acadd25f8f99fccf7637d4f" exitCode=0
Feb 27 01:27:02 crc kubenswrapper[4771]: I0227 01:27:02.131308 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzh27" event={"ID":"f0f3b6c0-daf5-40b9-bdd9-008890a2684a","Type":"ContainerDied","Data":"795b1623dbd9b500c659a38f5a9f2f23a7f05e928acadd25f8f99fccf7637d4f"}
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.556646 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.675080 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-combined-ca-bundle\") pod \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") "
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.675286 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-977ts\" (UniqueName: \"kubernetes.io/projected/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-kube-api-access-977ts\") pod \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") "
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.675317 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-config-data\") pod \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") "
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.675373 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-scripts\") pod \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\" (UID: \"f0f3b6c0-daf5-40b9-bdd9-008890a2684a\") "
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.680875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-kube-api-access-977ts" (OuterVolumeSpecName: "kube-api-access-977ts") pod "f0f3b6c0-daf5-40b9-bdd9-008890a2684a" (UID: "f0f3b6c0-daf5-40b9-bdd9-008890a2684a"). InnerVolumeSpecName "kube-api-access-977ts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.681496 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-scripts" (OuterVolumeSpecName: "scripts") pod "f0f3b6c0-daf5-40b9-bdd9-008890a2684a" (UID: "f0f3b6c0-daf5-40b9-bdd9-008890a2684a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.709086 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0f3b6c0-daf5-40b9-bdd9-008890a2684a" (UID: "f0f3b6c0-daf5-40b9-bdd9-008890a2684a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.713599 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-config-data" (OuterVolumeSpecName: "config-data") pod "f0f3b6c0-daf5-40b9-bdd9-008890a2684a" (UID: "f0f3b6c0-daf5-40b9-bdd9-008890a2684a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.777989 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.778210 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-977ts\" (UniqueName: \"kubernetes.io/projected/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-kube-api-access-977ts\") on node \"crc\" DevicePath \"\""
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.778332 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-config-data\") on node \"crc\" DevicePath \"\""
Feb 27 01:27:03 crc kubenswrapper[4771]: I0227 01:27:03.778456 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f3b6c0-daf5-40b9-bdd9-008890a2684a-scripts\") on node \"crc\" DevicePath \"\""
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.155657 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mzh27" event={"ID":"f0f3b6c0-daf5-40b9-bdd9-008890a2684a","Type":"ContainerDied","Data":"142b070946535f18e157924e484be334b583fb16706efb9e88d5b697f118a3bd"}
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.156083 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="142b070946535f18e157924e484be334b583fb16706efb9e88d5b697f118a3bd"
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.155736 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mzh27"
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.385471 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.385790 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b95f2002-9af0-440b-956d-2734dde1a919" containerName="nova-api-log" containerID="cri-o://6d058c14f94f6075b3d3fc347b91c26d963448674cbe80dcd44a15eb3a8e7e26" gracePeriod=30
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.385870 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b95f2002-9af0-440b-956d-2734dde1a919" containerName="nova-api-api" containerID="cri-o://67b43932ca2388709a10439977aacc1ede8a467713f717bc06b171fd409aa17d" gracePeriod=30
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.409099 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.409345 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bf83205a-9abc-4989-8804-2368ba05c0ef" containerName="nova-scheduler-scheduler" containerID="cri-o://cfc3024b9a7b28be1f5bcdd93565d72778f90a3bcd576c9e85767b245cd395cb" gracePeriod=30
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.510755 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.511271 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-log" containerID="cri-o://a14ac2fd614cde8c7e8a809fc3c047f9af302e879db29dc00bc8dc394a87bfe1" gracePeriod=30
Feb 27 01:27:04 crc kubenswrapper[4771]: I0227 01:27:04.511654 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-metadata" containerID="cri-o://f9a5dc02012b3c6cfdd5f5baa51125985982be78f0918140df7bbbd0715b337e" gracePeriod=30
Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.165048 4771 generic.go:334] "Generic (PLEG): container finished" podID="ccd33248-70b6-45be-852d-d10692d396ad" containerID="a14ac2fd614cde8c7e8a809fc3c047f9af302e879db29dc00bc8dc394a87bfe1" exitCode=143
Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.165131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccd33248-70b6-45be-852d-d10692d396ad","Type":"ContainerDied","Data":"a14ac2fd614cde8c7e8a809fc3c047f9af302e879db29dc00bc8dc394a87bfe1"}
Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.167626 4771 generic.go:334] "Generic (PLEG): container finished" podID="b95f2002-9af0-440b-956d-2734dde1a919" containerID="67b43932ca2388709a10439977aacc1ede8a467713f717bc06b171fd409aa17d" exitCode=0
Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.167652 4771 generic.go:334] "Generic (PLEG): container finished" podID="b95f2002-9af0-440b-956d-2734dde1a919" containerID="6d058c14f94f6075b3d3fc347b91c26d963448674cbe80dcd44a15eb3a8e7e26" exitCode=143
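[Editor's note] The teardown above shows the normal termination pattern: "Killing container with a grace period" (gracePeriod=30), then "container finished" with exitCode=143 for processes that die on SIGTERM and exitCode=0 for ones that exit cleanly within the grace period. A quick classifier over the "container finished" lines; the signal decoding is the standard 128+N convention (143 = 128+15), assumed to apply here:

    import re

    FINISHED = re.compile(r'containerID="(?P<cid>[0-9a-f]{64})" exitCode=(?P<code>-?\d+)')

    def explain(code: int) -> str:
        if code == 0:
            return 'clean exit'
        if code > 128:
            return f'killed by signal {code - 128}'  # 143 -> SIGTERM (15)
        return f'error exit {code}'

    def finished_containers(lines):
        for line in lines:
            m = FINISHED.search(line)
            if m:
                yield m.group('cid')[:12], explain(int(m.group('code')))

Against this capture: nova-api-api (67b43932...) exits cleanly, while nova-api-log (6d058c14...) and nova-metadata-log (a14ac2fd...) are killed by SIGTERM.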
event={"ID":"b95f2002-9af0-440b-956d-2734dde1a919","Type":"ContainerDied","Data":"67b43932ca2388709a10439977aacc1ede8a467713f717bc06b171fd409aa17d"} Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.167698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b95f2002-9af0-440b-956d-2734dde1a919","Type":"ContainerDied","Data":"6d058c14f94f6075b3d3fc347b91c26d963448674cbe80dcd44a15eb3a8e7e26"} Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.167710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b95f2002-9af0-440b-956d-2734dde1a919","Type":"ContainerDied","Data":"321ada706e0a6907360f751ad517012d615e733d4e28797816aaadaf3eeec543"} Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.167720 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="321ada706e0a6907360f751ad517012d615e733d4e28797816aaadaf3eeec543" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.200941 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.327164 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rxfk\" (UniqueName: \"kubernetes.io/projected/b95f2002-9af0-440b-956d-2734dde1a919-kube-api-access-9rxfk\") pod \"b95f2002-9af0-440b-956d-2734dde1a919\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.327282 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-internal-tls-certs\") pod \"b95f2002-9af0-440b-956d-2734dde1a919\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.327384 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-public-tls-certs\") pod \"b95f2002-9af0-440b-956d-2734dde1a919\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.327431 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-combined-ca-bundle\") pod \"b95f2002-9af0-440b-956d-2734dde1a919\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.327458 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-config-data\") pod \"b95f2002-9af0-440b-956d-2734dde1a919\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.327484 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95f2002-9af0-440b-956d-2734dde1a919-logs\") pod \"b95f2002-9af0-440b-956d-2734dde1a919\" (UID: \"b95f2002-9af0-440b-956d-2734dde1a919\") " Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.328323 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b95f2002-9af0-440b-956d-2734dde1a919-logs" (OuterVolumeSpecName: "logs") pod "b95f2002-9af0-440b-956d-2734dde1a919" (UID: 
"b95f2002-9af0-440b-956d-2734dde1a919"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.328968 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95f2002-9af0-440b-956d-2734dde1a919-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.333952 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b95f2002-9af0-440b-956d-2734dde1a919-kube-api-access-9rxfk" (OuterVolumeSpecName: "kube-api-access-9rxfk") pod "b95f2002-9af0-440b-956d-2734dde1a919" (UID: "b95f2002-9af0-440b-956d-2734dde1a919"). InnerVolumeSpecName "kube-api-access-9rxfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.364152 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-config-data" (OuterVolumeSpecName: "config-data") pod "b95f2002-9af0-440b-956d-2734dde1a919" (UID: "b95f2002-9af0-440b-956d-2734dde1a919"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.373231 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b95f2002-9af0-440b-956d-2734dde1a919" (UID: "b95f2002-9af0-440b-956d-2734dde1a919"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.385643 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b95f2002-9af0-440b-956d-2734dde1a919" (UID: "b95f2002-9af0-440b-956d-2734dde1a919"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.385820 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b95f2002-9af0-440b-956d-2734dde1a919" (UID: "b95f2002-9af0-440b-956d-2734dde1a919"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.430280 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rxfk\" (UniqueName: \"kubernetes.io/projected/b95f2002-9af0-440b-956d-2734dde1a919-kube-api-access-9rxfk\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.430322 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.430335 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.430348 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:05 crc kubenswrapper[4771]: I0227 01:27:05.430359 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95f2002-9af0-440b-956d-2734dde1a919-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.178388 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.205681 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.215446 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.252863 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.253302 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95f2002-9af0-440b-956d-2734dde1a919" containerName="nova-api-api" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253323 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95f2002-9af0-440b-956d-2734dde1a919" containerName="nova-api-api" Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.253345 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" containerName="init" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253354 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" containerName="init" Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.253366 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f3b6c0-daf5-40b9-bdd9-008890a2684a" containerName="nova-manage" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253372 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f3b6c0-daf5-40b9-bdd9-008890a2684a" containerName="nova-manage" Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.253394 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" containerName="dnsmasq-dns" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253400 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" 
containerName="dnsmasq-dns" Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.253414 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95f2002-9af0-440b-956d-2734dde1a919" containerName="nova-api-log" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253420 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95f2002-9af0-440b-956d-2734dde1a919" containerName="nova-api-log" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253626 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b95f2002-9af0-440b-956d-2734dde1a919" containerName="nova-api-log" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253639 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9923bd-11c1-4d1d-965b-17e8352ece8c" containerName="dnsmasq-dns" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253654 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f3b6c0-daf5-40b9-bdd9-008890a2684a" containerName="nova-manage" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.253669 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b95f2002-9af0-440b-956d-2734dde1a919" containerName="nova-api-api" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.254627 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.258482 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.258503 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.260413 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.266124 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.345736 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f57ee9-e99c-48b0-834b-af553e0c7e5f-logs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.345812 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.345839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-config-data\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.345957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6rv5\" (UniqueName: \"kubernetes.io/projected/01f57ee9-e99c-48b0-834b-af553e0c7e5f-kube-api-access-b6rv5\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 
01:27:06.346436 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-public-tls-certs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.346722 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.359355 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfc3024b9a7b28be1f5bcdd93565d72778f90a3bcd576c9e85767b245cd395cb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.361097 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfc3024b9a7b28be1f5bcdd93565d72778f90a3bcd576c9e85767b245cd395cb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.366349 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cfc3024b9a7b28be1f5bcdd93565d72778f90a3bcd576c9e85767b245cd395cb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 27 01:27:06 crc kubenswrapper[4771]: E0227 01:27:06.366390 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bf83205a-9abc-4989-8804-2368ba05c0ef" containerName="nova-scheduler-scheduler" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.449832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f57ee9-e99c-48b0-834b-af553e0c7e5f-logs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.449917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.449951 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-config-data\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.449974 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6rv5\" (UniqueName: 
\"kubernetes.io/projected/01f57ee9-e99c-48b0-834b-af553e0c7e5f-kube-api-access-b6rv5\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.450059 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-public-tls-certs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.450114 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.450516 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f57ee9-e99c-48b0-834b-af553e0c7e5f-logs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.459829 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-config-data\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.462221 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.465724 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.469439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f57ee9-e99c-48b0-834b-af553e0c7e5f-public-tls-certs\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.472376 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6rv5\" (UniqueName: \"kubernetes.io/projected/01f57ee9-e99c-48b0-834b-af553e0c7e5f-kube-api-access-b6rv5\") pod \"nova-api-0\" (UID: \"01f57ee9-e99c-48b0-834b-af553e0c7e5f\") " pod="openstack/nova-api-0" Feb 27 01:27:06 crc kubenswrapper[4771]: I0227 01:27:06.573487 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 27 01:27:07 crc kubenswrapper[4771]: I0227 01:27:07.093679 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 27 01:27:07 crc kubenswrapper[4771]: I0227 01:27:07.191726 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f57ee9-e99c-48b0-834b-af553e0c7e5f","Type":"ContainerStarted","Data":"dc8e4e6f09856241191c623ee914f25a160fd44de2e1efea032ffc462ab111d9"} Feb 27 01:27:07 crc kubenswrapper[4771]: I0227 01:27:07.790244 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b95f2002-9af0-440b-956d-2734dde1a919" path="/var/lib/kubelet/pods/b95f2002-9af0-440b-956d-2734dde1a919/volumes" Feb 27 01:27:07 crc kubenswrapper[4771]: I0227 01:27:07.917446 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:47816->10.217.0.204:8775: read: connection reset by peer" Feb 27 01:27:07 crc kubenswrapper[4771]: I0227 01:27:07.917485 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:47806->10.217.0.204:8775: read: connection reset by peer" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.209434 4771 generic.go:334] "Generic (PLEG): container finished" podID="ccd33248-70b6-45be-852d-d10692d396ad" containerID="f9a5dc02012b3c6cfdd5f5baa51125985982be78f0918140df7bbbd0715b337e" exitCode=0 Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.209581 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccd33248-70b6-45be-852d-d10692d396ad","Type":"ContainerDied","Data":"f9a5dc02012b3c6cfdd5f5baa51125985982be78f0918140df7bbbd0715b337e"} Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.212255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f57ee9-e99c-48b0-834b-af553e0c7e5f","Type":"ContainerStarted","Data":"faaf5e5cf9a3b45729e0b3674714d48fb7fa980a596e9782df6b48bd565ad4ae"} Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.212296 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01f57ee9-e99c-48b0-834b-af553e0c7e5f","Type":"ContainerStarted","Data":"ac100c240abe8dd98a68976ddc26dd7581d4fa4aa3ec416b4b5167198a4573af"} Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.245659 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.245637745 podStartE2EDuration="2.245637745s" podCreationTimestamp="2026-02-27 01:27:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:27:08.235280024 +0000 UTC m=+1341.172841312" watchObservedRunningTime="2026-02-27 01:27:08.245637745 +0000 UTC m=+1341.183199063" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.366851 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.496185 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-config-data\") pod \"ccd33248-70b6-45be-852d-d10692d396ad\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.496795 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-combined-ca-bundle\") pod \"ccd33248-70b6-45be-852d-d10692d396ad\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.497006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccd33248-70b6-45be-852d-d10692d396ad-logs\") pod \"ccd33248-70b6-45be-852d-d10692d396ad\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.497051 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5r2s\" (UniqueName: \"kubernetes.io/projected/ccd33248-70b6-45be-852d-d10692d396ad-kube-api-access-p5r2s\") pod \"ccd33248-70b6-45be-852d-d10692d396ad\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.497098 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-nova-metadata-tls-certs\") pod \"ccd33248-70b6-45be-852d-d10692d396ad\" (UID: \"ccd33248-70b6-45be-852d-d10692d396ad\") " Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.497921 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd33248-70b6-45be-852d-d10692d396ad-logs" (OuterVolumeSpecName: "logs") pod "ccd33248-70b6-45be-852d-d10692d396ad" (UID: "ccd33248-70b6-45be-852d-d10692d396ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.498562 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccd33248-70b6-45be-852d-d10692d396ad-logs\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.503875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd33248-70b6-45be-852d-d10692d396ad-kube-api-access-p5r2s" (OuterVolumeSpecName: "kube-api-access-p5r2s") pod "ccd33248-70b6-45be-852d-d10692d396ad" (UID: "ccd33248-70b6-45be-852d-d10692d396ad"). InnerVolumeSpecName "kube-api-access-p5r2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.533899 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccd33248-70b6-45be-852d-d10692d396ad" (UID: "ccd33248-70b6-45be-852d-d10692d396ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.539995 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-config-data" (OuterVolumeSpecName: "config-data") pod "ccd33248-70b6-45be-852d-d10692d396ad" (UID: "ccd33248-70b6-45be-852d-d10692d396ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.582489 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ccd33248-70b6-45be-852d-d10692d396ad" (UID: "ccd33248-70b6-45be-852d-d10692d396ad"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.600418 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5r2s\" (UniqueName: \"kubernetes.io/projected/ccd33248-70b6-45be-852d-d10692d396ad-kube-api-access-p5r2s\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.600447 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.600457 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:08 crc kubenswrapper[4771]: I0227 01:27:08.600468 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd33248-70b6-45be-852d-d10692d396ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.229065 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.229050 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ccd33248-70b6-45be-852d-d10692d396ad","Type":"ContainerDied","Data":"9c87c321841e287a1fb801c55d56aa6af0473a132e5ac064def59317645ad27e"} Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.229207 4771 scope.go:117] "RemoveContainer" containerID="f9a5dc02012b3c6cfdd5f5baa51125985982be78f0918140df7bbbd0715b337e" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.276948 4771 scope.go:117] "RemoveContainer" containerID="a14ac2fd614cde8c7e8a809fc3c047f9af302e879db29dc00bc8dc394a87bfe1" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.284352 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.334424 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.355645 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:27:09 crc kubenswrapper[4771]: E0227 01:27:09.356182 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-log" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.356209 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-log" Feb 27 01:27:09 crc kubenswrapper[4771]: E0227 01:27:09.356236 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-metadata" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.356245 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-metadata" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.360828 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-metadata" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.360875 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd33248-70b6-45be-852d-d10692d396ad" containerName="nova-metadata-log" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.362459 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.366405 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.367203 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.373360 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.521287 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81dfb61e-b373-4273-b55c-0d4680f89779-logs\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.521413 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnb66\" (UniqueName: \"kubernetes.io/projected/81dfb61e-b373-4273-b55c-0d4680f89779-kube-api-access-tnb66\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.521470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.521816 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.521893 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-config-data\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.625427 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81dfb61e-b373-4273-b55c-0d4680f89779-logs\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.626113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnb66\" (UniqueName: \"kubernetes.io/projected/81dfb61e-b373-4273-b55c-0d4680f89779-kube-api-access-tnb66\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.626173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " 
pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.626283 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81dfb61e-b373-4273-b55c-0d4680f89779-logs\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.626309 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.626470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-config-data\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.631735 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-config-data\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.632780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.649130 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81dfb61e-b373-4273-b55c-0d4680f89779-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.649219 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnb66\" (UniqueName: \"kubernetes.io/projected/81dfb61e-b373-4273-b55c-0d4680f89779-kube-api-access-tnb66\") pod \"nova-metadata-0\" (UID: \"81dfb61e-b373-4273-b55c-0d4680f89779\") " pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.680605 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 27 01:27:09 crc kubenswrapper[4771]: I0227 01:27:09.790015 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd33248-70b6-45be-852d-d10692d396ad" path="/var/lib/kubelet/pods/ccd33248-70b6-45be-852d-d10692d396ad/volumes" Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.194219 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.238822 4771 generic.go:334] "Generic (PLEG): container finished" podID="bf83205a-9abc-4989-8804-2368ba05c0ef" containerID="cfc3024b9a7b28be1f5bcdd93565d72778f90a3bcd576c9e85767b245cd395cb" exitCode=0 Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.238890 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf83205a-9abc-4989-8804-2368ba05c0ef","Type":"ContainerDied","Data":"cfc3024b9a7b28be1f5bcdd93565d72778f90a3bcd576c9e85767b245cd395cb"} Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.238922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf83205a-9abc-4989-8804-2368ba05c0ef","Type":"ContainerDied","Data":"5e58d30f676353340076883927be9cb25bdd80576e088f7ae5936a7ea410365b"} Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.238937 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e58d30f676353340076883927be9cb25bdd80576e088f7ae5936a7ea410365b" Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.250866 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.251981 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81dfb61e-b373-4273-b55c-0d4680f89779","Type":"ContainerStarted","Data":"afd1c0b150bb5f30c4bfda10a33838046d091e3287ee3dbd4d7d7361d91842d0"} Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.341159 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltqtj\" (UniqueName: \"kubernetes.io/projected/bf83205a-9abc-4989-8804-2368ba05c0ef-kube-api-access-ltqtj\") pod \"bf83205a-9abc-4989-8804-2368ba05c0ef\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.341295 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-config-data\") pod \"bf83205a-9abc-4989-8804-2368ba05c0ef\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.341413 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-combined-ca-bundle\") pod \"bf83205a-9abc-4989-8804-2368ba05c0ef\" (UID: \"bf83205a-9abc-4989-8804-2368ba05c0ef\") " Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.347305 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf83205a-9abc-4989-8804-2368ba05c0ef-kube-api-access-ltqtj" (OuterVolumeSpecName: "kube-api-access-ltqtj") pod "bf83205a-9abc-4989-8804-2368ba05c0ef" (UID: "bf83205a-9abc-4989-8804-2368ba05c0ef"). InnerVolumeSpecName "kube-api-access-ltqtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.374195 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-config-data" (OuterVolumeSpecName: "config-data") pod "bf83205a-9abc-4989-8804-2368ba05c0ef" (UID: "bf83205a-9abc-4989-8804-2368ba05c0ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.381161 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf83205a-9abc-4989-8804-2368ba05c0ef" (UID: "bf83205a-9abc-4989-8804-2368ba05c0ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.443239 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltqtj\" (UniqueName: \"kubernetes.io/projected/bf83205a-9abc-4989-8804-2368ba05c0ef-kube-api-access-ltqtj\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.443521 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:10 crc kubenswrapper[4771]: I0227 01:27:10.443534 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf83205a-9abc-4989-8804-2368ba05c0ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.269439 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.271213 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81dfb61e-b373-4273-b55c-0d4680f89779","Type":"ContainerStarted","Data":"c9d3d9ed095ead5c5a23107e1c0b541d8fdc4d3fb5b1a6de41644eccce21c018"} Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.271455 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81dfb61e-b373-4273-b55c-0d4680f89779","Type":"ContainerStarted","Data":"d509c9e37e4ac3ea5395dbd6853b07ca9e2aad42d504f8afa32b58b19e0042e8"} Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.317586 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.317494889 podStartE2EDuration="2.317494889s" podCreationTimestamp="2026-02-27 01:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:27:11.310190621 +0000 UTC m=+1344.247751959" watchObservedRunningTime="2026-02-27 01:27:11.317494889 +0000 UTC m=+1344.255056277" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.356028 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.367131 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.390698 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:27:11 crc kubenswrapper[4771]: E0227 01:27:11.391136 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf83205a-9abc-4989-8804-2368ba05c0ef" containerName="nova-scheduler-scheduler" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.391159 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf83205a-9abc-4989-8804-2368ba05c0ef" containerName="nova-scheduler-scheduler" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.391357 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf83205a-9abc-4989-8804-2368ba05c0ef" containerName="nova-scheduler-scheduler" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.391995 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.392088 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.417904 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.467181 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvbbm\" (UniqueName: \"kubernetes.io/projected/ec376af9-95db-45e8-bb5b-1a4bec9e0197-kube-api-access-xvbbm\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.467411 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec376af9-95db-45e8-bb5b-1a4bec9e0197-config-data\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.467445 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec376af9-95db-45e8-bb5b-1a4bec9e0197-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.569699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvbbm\" (UniqueName: \"kubernetes.io/projected/ec376af9-95db-45e8-bb5b-1a4bec9e0197-kube-api-access-xvbbm\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.569959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec376af9-95db-45e8-bb5b-1a4bec9e0197-config-data\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.570004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec376af9-95db-45e8-bb5b-1a4bec9e0197-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.577849 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec376af9-95db-45e8-bb5b-1a4bec9e0197-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.578037 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec376af9-95db-45e8-bb5b-1a4bec9e0197-config-data\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.596720 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvbbm\" (UniqueName: \"kubernetes.io/projected/ec376af9-95db-45e8-bb5b-1a4bec9e0197-kube-api-access-xvbbm\") pod \"nova-scheduler-0\" (UID: \"ec376af9-95db-45e8-bb5b-1a4bec9e0197\") " 
pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.733586 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 27 01:27:11 crc kubenswrapper[4771]: I0227 01:27:11.794094 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf83205a-9abc-4989-8804-2368ba05c0ef" path="/var/lib/kubelet/pods/bf83205a-9abc-4989-8804-2368ba05c0ef/volumes" Feb 27 01:27:12 crc kubenswrapper[4771]: I0227 01:27:12.260207 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 27 01:27:12 crc kubenswrapper[4771]: W0227 01:27:12.261737 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec376af9_95db_45e8_bb5b_1a4bec9e0197.slice/crio-6173c04f54c01d314b1558ab4020ff945dbe3c1679d008afcc60891ffb0f2982 WatchSource:0}: Error finding container 6173c04f54c01d314b1558ab4020ff945dbe3c1679d008afcc60891ffb0f2982: Status 404 returned error can't find the container with id 6173c04f54c01d314b1558ab4020ff945dbe3c1679d008afcc60891ffb0f2982 Feb 27 01:27:12 crc kubenswrapper[4771]: I0227 01:27:12.286930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec376af9-95db-45e8-bb5b-1a4bec9e0197","Type":"ContainerStarted","Data":"6173c04f54c01d314b1558ab4020ff945dbe3c1679d008afcc60891ffb0f2982"} Feb 27 01:27:13 crc kubenswrapper[4771]: I0227 01:27:13.303257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec376af9-95db-45e8-bb5b-1a4bec9e0197","Type":"ContainerStarted","Data":"355f0a44f07c4d3c20d469ade62290978d0dc582a04735f3cee50b84812504c2"} Feb 27 01:27:13 crc kubenswrapper[4771]: I0227 01:27:13.328419 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.328396326 podStartE2EDuration="2.328396326s" podCreationTimestamp="2026-02-27 01:27:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:27:13.324429818 +0000 UTC m=+1346.261991146" watchObservedRunningTime="2026-02-27 01:27:13.328396326 +0000 UTC m=+1346.265957644" Feb 27 01:27:14 crc kubenswrapper[4771]: I0227 01:27:14.681350 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 01:27:14 crc kubenswrapper[4771]: I0227 01:27:14.681932 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 27 01:27:16 crc kubenswrapper[4771]: I0227 01:27:16.574335 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 01:27:16 crc kubenswrapper[4771]: I0227 01:27:16.574397 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 27 01:27:16 crc kubenswrapper[4771]: I0227 01:27:16.734488 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 27 01:27:17 crc kubenswrapper[4771]: I0227 01:27:17.602860 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01f57ee9-e99c-48b0-834b-af553e0c7e5f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 01:27:17 crc kubenswrapper[4771]: I0227 
01:27:17.603280 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01f57ee9-e99c-48b0-834b-af553e0c7e5f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 01:27:19 crc kubenswrapper[4771]: I0227 01:27:19.681462 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 27 01:27:19 crc kubenswrapper[4771]: I0227 01:27:19.681890 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 27 01:27:20 crc kubenswrapper[4771]: I0227 01:27:20.688874 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="81dfb61e-b373-4273-b55c-0d4680f89779" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 01:27:20 crc kubenswrapper[4771]: I0227 01:27:20.696790 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="81dfb61e-b373-4273-b55c-0d4680f89779" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 27 01:27:21 crc kubenswrapper[4771]: I0227 01:27:21.734500 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 27 01:27:21 crc kubenswrapper[4771]: I0227 01:27:21.768915 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 27 01:27:22 crc kubenswrapper[4771]: I0227 01:27:22.449698 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 27 01:27:23 crc kubenswrapper[4771]: I0227 01:27:23.752700 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 27 01:27:26 crc kubenswrapper[4771]: I0227 01:27:26.586810 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 01:27:26 crc kubenswrapper[4771]: I0227 01:27:26.587706 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 01:27:26 crc kubenswrapper[4771]: I0227 01:27:26.593016 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 27 01:27:26 crc kubenswrapper[4771]: I0227 01:27:26.598593 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 01:27:27 crc kubenswrapper[4771]: I0227 01:27:27.479998 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 27 01:27:27 crc kubenswrapper[4771]: I0227 01:27:27.489383 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 27 01:27:28 crc kubenswrapper[4771]: I0227 01:27:28.953190 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:27:28 crc kubenswrapper[4771]: I0227 01:27:28.953539 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:27:28 crc kubenswrapper[4771]: I0227 01:27:28.953611 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:27:28 crc kubenswrapper[4771]: I0227 01:27:28.954803 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"466a33b6112ab220887139a7abe10596ba6afedbccef8b636c28177f74cb6a85"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:27:28 crc kubenswrapper[4771]: I0227 01:27:28.954907 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://466a33b6112ab220887139a7abe10596ba6afedbccef8b636c28177f74cb6a85" gracePeriod=600 Feb 27 01:27:28 crc kubenswrapper[4771]: E0227 01:27:28.989476 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca81e505_d53f_496e_bd26_7cec669591e4.slice/crio-466a33b6112ab220887139a7abe10596ba6afedbccef8b636c28177f74cb6a85.scope\": RecentStats: unable to find data in memory cache]" Feb 27 01:27:29 crc kubenswrapper[4771]: I0227 01:27:29.503823 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="466a33b6112ab220887139a7abe10596ba6afedbccef8b636c28177f74cb6a85" exitCode=0 Feb 27 01:27:29 crc kubenswrapper[4771]: I0227 01:27:29.503885 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"466a33b6112ab220887139a7abe10596ba6afedbccef8b636c28177f74cb6a85"} Feb 27 01:27:29 crc kubenswrapper[4771]: I0227 01:27:29.504263 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"c2fa94f2e2bead8dd6b922ec063e6ef0f0039cd25cc010b30deb4ce3bb130b4c"} Feb 27 01:27:29 crc kubenswrapper[4771]: I0227 01:27:29.504289 4771 scope.go:117] "RemoveContainer" containerID="f3112e69f234defa1fcff4a9c5517c895c98346bf69153547a5fa6e13f50fed1" Feb 27 01:27:29 crc kubenswrapper[4771]: I0227 01:27:29.690444 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 27 01:27:29 crc kubenswrapper[4771]: I0227 01:27:29.691049 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 27 01:27:29 crc kubenswrapper[4771]: I0227 01:27:29.705286 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 27 01:27:29 crc kubenswrapper[4771]: I0227 01:27:29.710922 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 27 01:27:37 crc kubenswrapper[4771]: I0227 
01:27:37.825464 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:27:38 crc kubenswrapper[4771]: I0227 01:27:38.642640 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:27:42 crc kubenswrapper[4771]: I0227 01:27:42.157250 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a2c84581-5806-46dd-b352-390ef2d9826c" containerName="rabbitmq" containerID="cri-o://15d2961cadf42189b71af6f1511da1e669276312eea977d283256c35c14a13bf" gracePeriod=604796 Feb 27 01:27:42 crc kubenswrapper[4771]: I0227 01:27:42.670216 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a3aec8d2-008a-4b77-a30b-23f8e812e332" containerName="rabbitmq" containerID="cri-o://67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f" gracePeriod=604796 Feb 27 01:27:48 crc kubenswrapper[4771]: I0227 01:27:48.728156 4771 generic.go:334] "Generic (PLEG): container finished" podID="a2c84581-5806-46dd-b352-390ef2d9826c" containerID="15d2961cadf42189b71af6f1511da1e669276312eea977d283256c35c14a13bf" exitCode=0 Feb 27 01:27:48 crc kubenswrapper[4771]: I0227 01:27:48.728711 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a2c84581-5806-46dd-b352-390ef2d9826c","Type":"ContainerDied","Data":"15d2961cadf42189b71af6f1511da1e669276312eea977d283256c35c14a13bf"} Feb 27 01:27:48 crc kubenswrapper[4771]: I0227 01:27:48.965685 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095425 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-config-data\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-erlang-cookie\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095510 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095628 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-tls\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095669 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqvl\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-kube-api-access-6mqvl\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095704 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2c84581-5806-46dd-b352-390ef2d9826c-pod-info\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095752 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-confd\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095770 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2c84581-5806-46dd-b352-390ef2d9826c-erlang-cookie-secret\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095864 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-plugins\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095882 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-server-conf\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.095905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-plugins-conf\") pod \"a2c84581-5806-46dd-b352-390ef2d9826c\" (UID: \"a2c84581-5806-46dd-b352-390ef2d9826c\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.096875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.098095 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.099439 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.104662 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a2c84581-5806-46dd-b352-390ef2d9826c-pod-info" (OuterVolumeSpecName: "pod-info") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.106371 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.110813 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c84581-5806-46dd-b352-390ef2d9826c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.118277 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-kube-api-access-6mqvl" (OuterVolumeSpecName: "kube-api-access-6mqvl") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "kube-api-access-6mqvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.119765 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.141648 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-config-data" (OuterVolumeSpecName: "config-data") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.162359 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-server-conf" (OuterVolumeSpecName: "server-conf") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201816 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201842 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201865 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201874 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201883 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqvl\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-kube-api-access-6mqvl\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201891 4771 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2c84581-5806-46dd-b352-390ef2d9826c-pod-info\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201898 4771 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2c84581-5806-46dd-b352-390ef2d9826c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201906 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201913 4771 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-server-conf\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.201922 4771 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2c84581-5806-46dd-b352-390ef2d9826c-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.206953 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.224520 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.256949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a2c84581-5806-46dd-b352-390ef2d9826c" (UID: "a2c84581-5806-46dd-b352-390ef2d9826c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302503 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-confd\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302706 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3aec8d2-008a-4b77-a30b-23f8e812e332-erlang-cookie-secret\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302737 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302759 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-tls\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302815 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-erlang-cookie\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302852 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3aec8d2-008a-4b77-a30b-23f8e812e332-pod-info\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302882 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-plugins\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302930 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-config-data\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.302982 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j2lp\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-kube-api-access-4j2lp\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.303012 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-plugins-conf\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: 
\"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.303039 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-server-conf\") pod \"a3aec8d2-008a-4b77-a30b-23f8e812e332\" (UID: \"a3aec8d2-008a-4b77-a30b-23f8e812e332\") " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.303814 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2c84581-5806-46dd-b352-390ef2d9826c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.303848 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.303867 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.304242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.305740 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3aec8d2-008a-4b77-a30b-23f8e812e332-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.306455 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.306813 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.318362 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a3aec8d2-008a-4b77-a30b-23f8e812e332-pod-info" (OuterVolumeSpecName: "pod-info") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.318564 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.319691 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-kube-api-access-4j2lp" (OuterVolumeSpecName: "kube-api-access-4j2lp") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "kube-api-access-4j2lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.362816 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-server-conf" (OuterVolumeSpecName: "server-conf") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.371785 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-config-data" (OuterVolumeSpecName: "config-data") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405680 4771 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3aec8d2-008a-4b77-a30b-23f8e812e332-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405730 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405741 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405752 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405761 4771 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3aec8d2-008a-4b77-a30b-23f8e812e332-pod-info\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405769 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405777 4771 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405785 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j2lp\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-kube-api-access-4j2lp\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405793 4771 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.405800 4771 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3aec8d2-008a-4b77-a30b-23f8e812e332-server-conf\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.429754 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a3aec8d2-008a-4b77-a30b-23f8e812e332" (UID: "a3aec8d2-008a-4b77-a30b-23f8e812e332"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.432368 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.507335 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.507375 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3aec8d2-008a-4b77-a30b-23f8e812e332-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.743301 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a2c84581-5806-46dd-b352-390ef2d9826c","Type":"ContainerDied","Data":"ff6ef1cf89726a6dc0f950f4744bd95917409a77a16eeef7189f699af48b4915"} Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.743531 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.743667 4771 scope.go:117] "RemoveContainer" containerID="15d2961cadf42189b71af6f1511da1e669276312eea977d283256c35c14a13bf" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.750064 4771 generic.go:334] "Generic (PLEG): container finished" podID="a3aec8d2-008a-4b77-a30b-23f8e812e332" containerID="67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f" exitCode=0 Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.750117 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a3aec8d2-008a-4b77-a30b-23f8e812e332","Type":"ContainerDied","Data":"67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f"} Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.750144 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.750165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a3aec8d2-008a-4b77-a30b-23f8e812e332","Type":"ContainerDied","Data":"9aa91a0925394f3ab132912f5de855e61d621abd853412282c32c069f18c50a5"} Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.790528 4771 scope.go:117] "RemoveContainer" containerID="21065341d65c55328d33fca19982cb91c451939d0b0dd32c90272cca9aecf888" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.792170 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.794667 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.818950 4771 scope.go:117] "RemoveContainer" containerID="67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.832608 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.844448 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.858244 4771 scope.go:117] "RemoveContainer" containerID="29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.871530 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:27:49 crc kubenswrapper[4771]: E0227 01:27:49.874607 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c84581-5806-46dd-b352-390ef2d9826c" containerName="setup-container" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.874634 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c84581-5806-46dd-b352-390ef2d9826c" containerName="setup-container" Feb 27 01:27:49 crc kubenswrapper[4771]: E0227 01:27:49.874655 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3aec8d2-008a-4b77-a30b-23f8e812e332" containerName="setup-container" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.874664 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3aec8d2-008a-4b77-a30b-23f8e812e332" containerName="setup-container" Feb 27 01:27:49 crc kubenswrapper[4771]: E0227 01:27:49.874691 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c84581-5806-46dd-b352-390ef2d9826c" containerName="rabbitmq" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.874697 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c84581-5806-46dd-b352-390ef2d9826c" containerName="rabbitmq" Feb 27 01:27:49 crc kubenswrapper[4771]: E0227 01:27:49.874719 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3aec8d2-008a-4b77-a30b-23f8e812e332" containerName="rabbitmq" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.874725 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3aec8d2-008a-4b77-a30b-23f8e812e332" containerName="rabbitmq" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.875513 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3aec8d2-008a-4b77-a30b-23f8e812e332" containerName="rabbitmq" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.875541 4771 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a2c84581-5806-46dd-b352-390ef2d9826c" containerName="rabbitmq" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.879895 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.882830 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.884735 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.884780 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.884812 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.884997 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.885134 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.885339 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.885608 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.886215 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sl82b" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.891410 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.891501 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pd5wj" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.891701 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.891820 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.891849 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.891919 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.892007 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.908920 4771 scope.go:117] "RemoveContainer" containerID="67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f" Feb 27 01:27:49 crc kubenswrapper[4771]: E0227 01:27:49.913213 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f\": container with ID starting with 67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f not found: ID does not exist" 
containerID="67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.914610 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f"} err="failed to get container status \"67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f\": rpc error: code = NotFound desc = could not find container \"67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f\": container with ID starting with 67b2e3fd5aad9a96898bb725ef413d83e0bc7369e8bc03e5c113578e5985f46f not found: ID does not exist" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.914702 4771 scope.go:117] "RemoveContainer" containerID="29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.914343 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lswhl\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-kube-api-access-lswhl\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.914979 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7813115d-b642-406c-892d-61b10c9777d2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.915076 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7813115d-b642-406c-892d-61b10c9777d2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.915179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.915260 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.915388 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.915482 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " 
pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.919941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.920126 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.920288 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.920507 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.921852 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:27:49 crc kubenswrapper[4771]: E0227 01:27:49.922984 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70\": container with ID starting with 29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70 not found: ID does not exist" containerID="29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.923028 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70"} err="failed to get container status \"29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70\": rpc error: code = NotFound desc = could not find container \"29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70\": container with ID starting with 29e220567126edda63a240d888ea845a559902f6854e201515bb3868c5d74e70 not found: ID does not exist" Feb 27 01:27:49 crc kubenswrapper[4771]: I0227 01:27:49.933880 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7813115d-b642-406c-892d-61b10c9777d2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022245 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022269 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022290 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022318 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022340 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/370e8739-d955-433e-8f61-b8e3bc1d8dc7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022381 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022403 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022426 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022441 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " 
pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/370e8739-d955-433e-8f61-b8e3bc1d8dc7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022482 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnxz9\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-kube-api-access-gnxz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022541 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022606 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022625 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022646 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022685 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lswhl\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-kube-api-access-lswhl\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 
crc kubenswrapper[4771]: I0227 01:27:50.022707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.022738 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7813115d-b642-406c-892d-61b10c9777d2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.026065 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.026734 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.028238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.028403 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.029068 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.029664 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7813115d-b642-406c-892d-61b10c9777d2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.031284 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.032296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7813115d-b642-406c-892d-61b10c9777d2-pod-info\") pod \"rabbitmq-server-0\" 
(UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.033113 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7813115d-b642-406c-892d-61b10c9777d2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.036331 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.048594 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lswhl\" (UniqueName: \"kubernetes.io/projected/7813115d-b642-406c-892d-61b10c9777d2-kube-api-access-lswhl\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.067495 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"7813115d-b642-406c-892d-61b10c9777d2\") " pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.123790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.123875 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.123908 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.123936 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/370e8739-d955-433e-8f61-b8e3bc1d8dc7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.123962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.123995 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/370e8739-d955-433e-8f61-b8e3bc1d8dc7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.124010 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnxz9\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-kube-api-access-gnxz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.124043 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.124063 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.124113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.124138 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.124274 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.124797 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.124916 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.125042 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") device mount path 
\"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.125344 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/370e8739-d955-433e-8f61-b8e3bc1d8dc7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.125564 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.129061 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/370e8739-d955-433e-8f61-b8e3bc1d8dc7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.129522 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.130371 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/370e8739-d955-433e-8f61-b8e3bc1d8dc7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.135465 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.142237 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnxz9\" (UniqueName: \"kubernetes.io/projected/370e8739-d955-433e-8f61-b8e3bc1d8dc7-kube-api-access-gnxz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.151116 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"370e8739-d955-433e-8f61-b8e3bc1d8dc7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.259108 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.274602 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.811716 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 27 01:27:50 crc kubenswrapper[4771]: W0227 01:27:50.814276 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7813115d_b642_406c_892d_61b10c9777d2.slice/crio-297a48e25abc55a14c2bad7a4502c5e7bc76999bffade98eabd796aec2af3d7c WatchSource:0}: Error finding container 297a48e25abc55a14c2bad7a4502c5e7bc76999bffade98eabd796aec2af3d7c: Status 404 returned error can't find the container with id 297a48e25abc55a14c2bad7a4502c5e7bc76999bffade98eabd796aec2af3d7c Feb 27 01:27:50 crc kubenswrapper[4771]: I0227 01:27:50.873040 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 27 01:27:50 crc kubenswrapper[4771]: W0227 01:27:50.874556 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod370e8739_d955_433e_8f61_b8e3bc1d8dc7.slice/crio-1e007bd6ded6dccf3fd09c193b80bfc7fcefac8e5539e7e15af05b59ff3d15ea WatchSource:0}: Error finding container 1e007bd6ded6dccf3fd09c193b80bfc7fcefac8e5539e7e15af05b59ff3d15ea: Status 404 returned error can't find the container with id 1e007bd6ded6dccf3fd09c193b80bfc7fcefac8e5539e7e15af05b59ff3d15ea Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.662465 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vvrn8"] Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.664818 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.670586 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.680906 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vvrn8"] Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.755744 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.755837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.755930 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.755955 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.755974 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-config\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.756015 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9c4\" (UniqueName: \"kubernetes.io/projected/55452564-ea38-46a1-bb44-7cac2ae8840a-kube-api-access-qq9c4\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.756045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.769176 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"370e8739-d955-433e-8f61-b8e3bc1d8dc7","Type":"ContainerStarted","Data":"1e007bd6ded6dccf3fd09c193b80bfc7fcefac8e5539e7e15af05b59ff3d15ea"} Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.770494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7813115d-b642-406c-892d-61b10c9777d2","Type":"ContainerStarted","Data":"297a48e25abc55a14c2bad7a4502c5e7bc76999bffade98eabd796aec2af3d7c"} Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.785312 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c84581-5806-46dd-b352-390ef2d9826c" path="/var/lib/kubelet/pods/a2c84581-5806-46dd-b352-390ef2d9826c/volumes" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.786208 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3aec8d2-008a-4b77-a30b-23f8e812e332" path="/var/lib/kubelet/pods/a3aec8d2-008a-4b77-a30b-23f8e812e332/volumes" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.858521 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9c4\" (UniqueName: \"kubernetes.io/projected/55452564-ea38-46a1-bb44-7cac2ae8840a-kube-api-access-qq9c4\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.858627 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.858729 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.858786 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.858887 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.858939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.858957 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-config\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.859439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.859921 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.859976 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.860096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-config\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.860100 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: 
\"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.860411 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:51 crc kubenswrapper[4771]: I0227 01:27:51.882639 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9c4\" (UniqueName: \"kubernetes.io/projected/55452564-ea38-46a1-bb44-7cac2ae8840a-kube-api-access-qq9c4\") pod \"dnsmasq-dns-79bd4cc8c9-vvrn8\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:52 crc kubenswrapper[4771]: I0227 01:27:52.022014 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:52 crc kubenswrapper[4771]: I0227 01:27:52.464425 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vvrn8"] Feb 27 01:27:52 crc kubenswrapper[4771]: W0227 01:27:52.529374 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55452564_ea38_46a1_bb44_7cac2ae8840a.slice/crio-0a05b14a4914d344622a21ab2ccc35f9b7895a911a790dce3c9bd88c70575af5 WatchSource:0}: Error finding container 0a05b14a4914d344622a21ab2ccc35f9b7895a911a790dce3c9bd88c70575af5: Status 404 returned error can't find the container with id 0a05b14a4914d344622a21ab2ccc35f9b7895a911a790dce3c9bd88c70575af5 Feb 27 01:27:52 crc kubenswrapper[4771]: I0227 01:27:52.778268 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"370e8739-d955-433e-8f61-b8e3bc1d8dc7","Type":"ContainerStarted","Data":"240e8ae49c92a5945e4f0381ea3f0fa3667e76a1be7ac727e601acccb64a457f"} Feb 27 01:27:52 crc kubenswrapper[4771]: I0227 01:27:52.780421 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7813115d-b642-406c-892d-61b10c9777d2","Type":"ContainerStarted","Data":"fc882403dc9f93eade06fcb944b7af5e74973bfea4dd83a1fe459d989a67dc4e"} Feb 27 01:27:52 crc kubenswrapper[4771]: I0227 01:27:52.786809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" event={"ID":"55452564-ea38-46a1-bb44-7cac2ae8840a","Type":"ContainerStarted","Data":"f7368f73e637edf4de0a9418598c66840ba06f9aa57967f476097ffbbf539b74"} Feb 27 01:27:52 crc kubenswrapper[4771]: I0227 01:27:52.786853 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" event={"ID":"55452564-ea38-46a1-bb44-7cac2ae8840a","Type":"ContainerStarted","Data":"0a05b14a4914d344622a21ab2ccc35f9b7895a911a790dce3c9bd88c70575af5"} Feb 27 01:27:53 crc kubenswrapper[4771]: I0227 01:27:53.801096 4771 generic.go:334] "Generic (PLEG): container finished" podID="55452564-ea38-46a1-bb44-7cac2ae8840a" containerID="f7368f73e637edf4de0a9418598c66840ba06f9aa57967f476097ffbbf539b74" exitCode=0 Feb 27 01:27:53 crc kubenswrapper[4771]: I0227 01:27:53.801177 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" 
event={"ID":"55452564-ea38-46a1-bb44-7cac2ae8840a","Type":"ContainerDied","Data":"f7368f73e637edf4de0a9418598c66840ba06f9aa57967f476097ffbbf539b74"} Feb 27 01:27:54 crc kubenswrapper[4771]: I0227 01:27:54.816960 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" event={"ID":"55452564-ea38-46a1-bb44-7cac2ae8840a","Type":"ContainerStarted","Data":"edbd000fd8085d67ebfe8efaceb824e17b04f8953266f4e158e4f27bd690f6a9"} Feb 27 01:27:54 crc kubenswrapper[4771]: I0227 01:27:54.817812 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:27:54 crc kubenswrapper[4771]: I0227 01:27:54.846685 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" podStartSLOduration=3.846655059 podStartE2EDuration="3.846655059s" podCreationTimestamp="2026-02-27 01:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:27:54.838433308 +0000 UTC m=+1387.775994656" watchObservedRunningTime="2026-02-27 01:27:54.846655059 +0000 UTC m=+1387.784216367" Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.150024 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535928-jj8zd"] Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.153179 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-jj8zd" Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.157489 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.158107 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.158509 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.175343 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535928-jj8zd"] Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.244938 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx8n4\" (UniqueName: \"kubernetes.io/projected/0d28e3a3-ccd6-4c74-996b-cb8844471672-kube-api-access-zx8n4\") pod \"auto-csr-approver-29535928-jj8zd\" (UID: \"0d28e3a3-ccd6-4c74-996b-cb8844471672\") " pod="openshift-infra/auto-csr-approver-29535928-jj8zd" Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.347201 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx8n4\" (UniqueName: \"kubernetes.io/projected/0d28e3a3-ccd6-4c74-996b-cb8844471672-kube-api-access-zx8n4\") pod \"auto-csr-approver-29535928-jj8zd\" (UID: \"0d28e3a3-ccd6-4c74-996b-cb8844471672\") " pod="openshift-infra/auto-csr-approver-29535928-jj8zd" Feb 27 01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.368435 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx8n4\" (UniqueName: \"kubernetes.io/projected/0d28e3a3-ccd6-4c74-996b-cb8844471672-kube-api-access-zx8n4\") pod \"auto-csr-approver-29535928-jj8zd\" (UID: \"0d28e3a3-ccd6-4c74-996b-cb8844471672\") " pod="openshift-infra/auto-csr-approver-29535928-jj8zd" Feb 27 
01:28:00 crc kubenswrapper[4771]: I0227 01:28:00.490091 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-jj8zd" Feb 27 01:28:01 crc kubenswrapper[4771]: I0227 01:28:01.113240 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535928-jj8zd"] Feb 27 01:28:01 crc kubenswrapper[4771]: I0227 01:28:01.901852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-jj8zd" event={"ID":"0d28e3a3-ccd6-4c74-996b-cb8844471672","Type":"ContainerStarted","Data":"c39d695e98fc2f5f84a98f7cd2a30990db01297510888821898890b33feae4ba"} Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.023952 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.132107 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-jc8lh"] Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.132598 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" podUID="7e507450-eb79-43bc-ae7b-89352c222a44" containerName="dnsmasq-dns" containerID="cri-o://47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf" gracePeriod=10 Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.313063 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-b7nss"] Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.316329 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.357995 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-b7nss"] Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.491881 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.492191 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.492233 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.492330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-dns-svc\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc 
kubenswrapper[4771]: I0227 01:28:02.492355 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb8hs\" (UniqueName: \"kubernetes.io/projected/13aff92d-bbb5-4229-8296-90dea52e389a-kube-api-access-rb8hs\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.492393 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-config\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.492417 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.593641 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.593689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.593733 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.593807 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-dns-svc\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.593838 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb8hs\" (UniqueName: \"kubernetes.io/projected/13aff92d-bbb5-4229-8296-90dea52e389a-kube-api-access-rb8hs\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.593876 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-config\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 
01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.593915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.594989 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.595636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.596325 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.596670 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-dns-svc\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.597203 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-config\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.597316 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13aff92d-bbb5-4229-8296-90dea52e389a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.628955 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb8hs\" (UniqueName: \"kubernetes.io/projected/13aff92d-bbb5-4229-8296-90dea52e389a-kube-api-access-rb8hs\") pod \"dnsmasq-dns-55478c4467-b7nss\" (UID: \"13aff92d-bbb5-4229-8296-90dea52e389a\") " pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.649122 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.742236 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.900986 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-config\") pod \"7e507450-eb79-43bc-ae7b-89352c222a44\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.901049 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-nb\") pod \"7e507450-eb79-43bc-ae7b-89352c222a44\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.901127 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h97cn\" (UniqueName: \"kubernetes.io/projected/7e507450-eb79-43bc-ae7b-89352c222a44-kube-api-access-h97cn\") pod \"7e507450-eb79-43bc-ae7b-89352c222a44\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.901168 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-sb\") pod \"7e507450-eb79-43bc-ae7b-89352c222a44\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.901231 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-swift-storage-0\") pod \"7e507450-eb79-43bc-ae7b-89352c222a44\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.901321 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-svc\") pod \"7e507450-eb79-43bc-ae7b-89352c222a44\" (UID: \"7e507450-eb79-43bc-ae7b-89352c222a44\") " Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.906138 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e507450-eb79-43bc-ae7b-89352c222a44-kube-api-access-h97cn" (OuterVolumeSpecName: "kube-api-access-h97cn") pod "7e507450-eb79-43bc-ae7b-89352c222a44" (UID: "7e507450-eb79-43bc-ae7b-89352c222a44"). InnerVolumeSpecName "kube-api-access-h97cn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.921117 4771 generic.go:334] "Generic (PLEG): container finished" podID="7e507450-eb79-43bc-ae7b-89352c222a44" containerID="47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf" exitCode=0 Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.921198 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" event={"ID":"7e507450-eb79-43bc-ae7b-89352c222a44","Type":"ContainerDied","Data":"47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf"} Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.921231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" event={"ID":"7e507450-eb79-43bc-ae7b-89352c222a44","Type":"ContainerDied","Data":"0b28ee3e6bbde0f2462b65ab797aeae0b40bb827edfe5c4e7f638877f52e26f6"} Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.921249 4771 scope.go:117] "RemoveContainer" containerID="47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.921799 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-jc8lh" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.926588 4771 generic.go:334] "Generic (PLEG): container finished" podID="0d28e3a3-ccd6-4c74-996b-cb8844471672" containerID="47b89cdc4291cbcb5b6c29ef410d93e1215af46db51f890e53a4204f77f24d90" exitCode=0 Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.926626 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-jj8zd" event={"ID":"0d28e3a3-ccd6-4c74-996b-cb8844471672","Type":"ContainerDied","Data":"47b89cdc4291cbcb5b6c29ef410d93e1215af46db51f890e53a4204f77f24d90"} Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.950702 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-config" (OuterVolumeSpecName: "config") pod "7e507450-eb79-43bc-ae7b-89352c222a44" (UID: "7e507450-eb79-43bc-ae7b-89352c222a44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.958807 4771 scope.go:117] "RemoveContainer" containerID="b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.959212 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e507450-eb79-43bc-ae7b-89352c222a44" (UID: "7e507450-eb79-43bc-ae7b-89352c222a44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.961424 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e507450-eb79-43bc-ae7b-89352c222a44" (UID: "7e507450-eb79-43bc-ae7b-89352c222a44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.963439 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e507450-eb79-43bc-ae7b-89352c222a44" (UID: "7e507450-eb79-43bc-ae7b-89352c222a44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.964657 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e507450-eb79-43bc-ae7b-89352c222a44" (UID: "7e507450-eb79-43bc-ae7b-89352c222a44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.984884 4771 scope.go:117] "RemoveContainer" containerID="47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf" Feb 27 01:28:02 crc kubenswrapper[4771]: E0227 01:28:02.985610 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf\": container with ID starting with 47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf not found: ID does not exist" containerID="47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.985645 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf"} err="failed to get container status \"47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf\": rpc error: code = NotFound desc = could not find container \"47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf\": container with ID starting with 47fc1671416415f62793ec01aad9d8d3dd23dbaa69355a05c405d3c3e52adabf not found: ID does not exist" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.985666 4771 scope.go:117] "RemoveContainer" containerID="b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2" Feb 27 01:28:02 crc kubenswrapper[4771]: E0227 01:28:02.986009 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2\": container with ID starting with b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2 not found: ID does not exist" containerID="b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2" Feb 27 01:28:02 crc kubenswrapper[4771]: I0227 01:28:02.986053 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2"} err="failed to get container status \"b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2\": rpc error: code = NotFound desc = could not find container \"b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2\": container with ID starting with b47bf402f1d987d47b0b75f1599f21e4974fcb2e6a2d7d3420a07b36a7033fc2 not found: ID does not exist" Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.004728 4771 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-h97cn\" (UniqueName: \"kubernetes.io/projected/7e507450-eb79-43bc-ae7b-89352c222a44-kube-api-access-h97cn\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.004759 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.004768 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.004778 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.004789 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.004797 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e507450-eb79-43bc-ae7b-89352c222a44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.107143 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-b7nss"] Feb 27 01:28:03 crc kubenswrapper[4771]: W0227 01:28:03.116831 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13aff92d_bbb5_4229_8296_90dea52e389a.slice/crio-dd27e7179a549f2f2d745b71f52d4607ba20a3ca0e70565041b29a1693809c95 WatchSource:0}: Error finding container dd27e7179a549f2f2d745b71f52d4607ba20a3ca0e70565041b29a1693809c95: Status 404 returned error can't find the container with id dd27e7179a549f2f2d745b71f52d4607ba20a3ca0e70565041b29a1693809c95 Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.335992 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-jc8lh"] Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.343740 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-jc8lh"] Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.789621 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e507450-eb79-43bc-ae7b-89352c222a44" path="/var/lib/kubelet/pods/7e507450-eb79-43bc-ae7b-89352c222a44/volumes" Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.942300 4771 generic.go:334] "Generic (PLEG): container finished" podID="13aff92d-bbb5-4229-8296-90dea52e389a" containerID="d53fa9eea44ac1abb1e9d72a98e869ca0c30bcb03f03092ef50dde4b4eb8b0df" exitCode=0 Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.942389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-b7nss" event={"ID":"13aff92d-bbb5-4229-8296-90dea52e389a","Type":"ContainerDied","Data":"d53fa9eea44ac1abb1e9d72a98e869ca0c30bcb03f03092ef50dde4b4eb8b0df"} Feb 27 01:28:03 crc kubenswrapper[4771]: I0227 01:28:03.942430 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-b7nss" 
event={"ID":"13aff92d-bbb5-4229-8296-90dea52e389a","Type":"ContainerStarted","Data":"dd27e7179a549f2f2d745b71f52d4607ba20a3ca0e70565041b29a1693809c95"} Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.288837 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-jj8zd" Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.446212 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx8n4\" (UniqueName: \"kubernetes.io/projected/0d28e3a3-ccd6-4c74-996b-cb8844471672-kube-api-access-zx8n4\") pod \"0d28e3a3-ccd6-4c74-996b-cb8844471672\" (UID: \"0d28e3a3-ccd6-4c74-996b-cb8844471672\") " Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.449950 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d28e3a3-ccd6-4c74-996b-cb8844471672-kube-api-access-zx8n4" (OuterVolumeSpecName: "kube-api-access-zx8n4") pod "0d28e3a3-ccd6-4c74-996b-cb8844471672" (UID: "0d28e3a3-ccd6-4c74-996b-cb8844471672"). InnerVolumeSpecName "kube-api-access-zx8n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.548078 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx8n4\" (UniqueName: \"kubernetes.io/projected/0d28e3a3-ccd6-4c74-996b-cb8844471672-kube-api-access-zx8n4\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.956053 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535928-jj8zd" event={"ID":"0d28e3a3-ccd6-4c74-996b-cb8844471672","Type":"ContainerDied","Data":"c39d695e98fc2f5f84a98f7cd2a30990db01297510888821898890b33feae4ba"} Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.956098 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c39d695e98fc2f5f84a98f7cd2a30990db01297510888821898890b33feae4ba" Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.956117 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535928-jj8zd" Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.958984 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-b7nss" event={"ID":"13aff92d-bbb5-4229-8296-90dea52e389a","Type":"ContainerStarted","Data":"f2432bafcd1504a931ca30c3c53e1720e5ff4dd32f4f727e0fcd6c9b200cdbaa"} Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.959777 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:04 crc kubenswrapper[4771]: I0227 01:28:04.987320 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-b7nss" podStartSLOduration=2.987295349 podStartE2EDuration="2.987295349s" podCreationTimestamp="2026-02-27 01:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:28:04.978481862 +0000 UTC m=+1397.916043150" watchObservedRunningTime="2026-02-27 01:28:04.987295349 +0000 UTC m=+1397.924856637" Feb 27 01:28:05 crc kubenswrapper[4771]: I0227 01:28:05.357763 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-bgv8w"] Feb 27 01:28:05 crc kubenswrapper[4771]: I0227 01:28:05.365533 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535922-bgv8w"] Feb 27 01:28:05 crc kubenswrapper[4771]: I0227 01:28:05.799024 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630dd7a9-ea0e-4a92-aedc-8f737ea48316" path="/var/lib/kubelet/pods/630dd7a9-ea0e-4a92-aedc-8f737ea48316/volumes" Feb 27 01:28:12 crc kubenswrapper[4771]: I0227 01:28:12.650748 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-b7nss" Feb 27 01:28:12 crc kubenswrapper[4771]: I0227 01:28:12.761252 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vvrn8"] Feb 27 01:28:12 crc kubenswrapper[4771]: I0227 01:28:12.762087 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" podUID="55452564-ea38-46a1-bb44-7cac2ae8840a" containerName="dnsmasq-dns" containerID="cri-o://edbd000fd8085d67ebfe8efaceb824e17b04f8953266f4e158e4f27bd690f6a9" gracePeriod=10 Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.075360 4771 generic.go:334] "Generic (PLEG): container finished" podID="55452564-ea38-46a1-bb44-7cac2ae8840a" containerID="edbd000fd8085d67ebfe8efaceb824e17b04f8953266f4e158e4f27bd690f6a9" exitCode=0 Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.075405 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" event={"ID":"55452564-ea38-46a1-bb44-7cac2ae8840a","Type":"ContainerDied","Data":"edbd000fd8085d67ebfe8efaceb824e17b04f8953266f4e158e4f27bd690f6a9"} Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.306877 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.448215 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-sb\") pod \"55452564-ea38-46a1-bb44-7cac2ae8840a\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.448317 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-config\") pod \"55452564-ea38-46a1-bb44-7cac2ae8840a\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.448365 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-openstack-edpm-ipam\") pod \"55452564-ea38-46a1-bb44-7cac2ae8840a\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.448416 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-svc\") pod \"55452564-ea38-46a1-bb44-7cac2ae8840a\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.448458 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-nb\") pod \"55452564-ea38-46a1-bb44-7cac2ae8840a\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.448864 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq9c4\" (UniqueName: \"kubernetes.io/projected/55452564-ea38-46a1-bb44-7cac2ae8840a-kube-api-access-qq9c4\") pod \"55452564-ea38-46a1-bb44-7cac2ae8840a\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.448917 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-swift-storage-0\") pod \"55452564-ea38-46a1-bb44-7cac2ae8840a\" (UID: \"55452564-ea38-46a1-bb44-7cac2ae8840a\") " Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.455834 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55452564-ea38-46a1-bb44-7cac2ae8840a-kube-api-access-qq9c4" (OuterVolumeSpecName: "kube-api-access-qq9c4") pod "55452564-ea38-46a1-bb44-7cac2ae8840a" (UID: "55452564-ea38-46a1-bb44-7cac2ae8840a"). InnerVolumeSpecName "kube-api-access-qq9c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.506319 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-config" (OuterVolumeSpecName: "config") pod "55452564-ea38-46a1-bb44-7cac2ae8840a" (UID: "55452564-ea38-46a1-bb44-7cac2ae8840a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.509053 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55452564-ea38-46a1-bb44-7cac2ae8840a" (UID: "55452564-ea38-46a1-bb44-7cac2ae8840a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.511174 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "55452564-ea38-46a1-bb44-7cac2ae8840a" (UID: "55452564-ea38-46a1-bb44-7cac2ae8840a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.512114 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55452564-ea38-46a1-bb44-7cac2ae8840a" (UID: "55452564-ea38-46a1-bb44-7cac2ae8840a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.514064 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55452564-ea38-46a1-bb44-7cac2ae8840a" (UID: "55452564-ea38-46a1-bb44-7cac2ae8840a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.516330 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "55452564-ea38-46a1-bb44-7cac2ae8840a" (UID: "55452564-ea38-46a1-bb44-7cac2ae8840a"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.550886 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.550927 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.550945 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq9c4\" (UniqueName: \"kubernetes.io/projected/55452564-ea38-46a1-bb44-7cac2ae8840a-kube-api-access-qq9c4\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.550958 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.550972 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.550982 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-config\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:13 crc kubenswrapper[4771]: I0227 01:28:13.550993 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/55452564-ea38-46a1-bb44-7cac2ae8840a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:14 crc kubenswrapper[4771]: I0227 01:28:14.088630 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" event={"ID":"55452564-ea38-46a1-bb44-7cac2ae8840a","Type":"ContainerDied","Data":"0a05b14a4914d344622a21ab2ccc35f9b7895a911a790dce3c9bd88c70575af5"} Feb 27 01:28:14 crc kubenswrapper[4771]: I0227 01:28:14.088699 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-vvrn8" Feb 27 01:28:14 crc kubenswrapper[4771]: I0227 01:28:14.088709 4771 scope.go:117] "RemoveContainer" containerID="edbd000fd8085d67ebfe8efaceb824e17b04f8953266f4e158e4f27bd690f6a9" Feb 27 01:28:14 crc kubenswrapper[4771]: I0227 01:28:14.119254 4771 scope.go:117] "RemoveContainer" containerID="f7368f73e637edf4de0a9418598c66840ba06f9aa57967f476097ffbbf539b74" Feb 27 01:28:14 crc kubenswrapper[4771]: I0227 01:28:14.131167 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vvrn8"] Feb 27 01:28:14 crc kubenswrapper[4771]: I0227 01:28:14.139845 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-vvrn8"] Feb 27 01:28:15 crc kubenswrapper[4771]: I0227 01:28:15.788385 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55452564-ea38-46a1-bb44-7cac2ae8840a" path="/var/lib/kubelet/pods/55452564-ea38-46a1-bb44-7cac2ae8840a/volumes" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.231346 4771 generic.go:334] "Generic (PLEG): container finished" podID="7813115d-b642-406c-892d-61b10c9777d2" containerID="fc882403dc9f93eade06fcb944b7af5e74973bfea4dd83a1fe459d989a67dc4e" exitCode=0 Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.231400 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7813115d-b642-406c-892d-61b10c9777d2","Type":"ContainerDied","Data":"fc882403dc9f93eade06fcb944b7af5e74973bfea4dd83a1fe459d989a67dc4e"} Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.835662 4771 scope.go:117] "RemoveContainer" containerID="3d97010849da7ce7464351a42782e10357865d01cc58a899ca02dd5b8a8ff0f1" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.886623 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt"] Feb 27 01:28:25 crc kubenswrapper[4771]: E0227 01:28:25.888032 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d28e3a3-ccd6-4c74-996b-cb8844471672" containerName="oc" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.888058 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d28e3a3-ccd6-4c74-996b-cb8844471672" containerName="oc" Feb 27 01:28:25 crc kubenswrapper[4771]: E0227 01:28:25.888081 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55452564-ea38-46a1-bb44-7cac2ae8840a" containerName="init" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.888089 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="55452564-ea38-46a1-bb44-7cac2ae8840a" containerName="init" Feb 27 01:28:25 crc kubenswrapper[4771]: E0227 01:28:25.888111 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e507450-eb79-43bc-ae7b-89352c222a44" containerName="init" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.888118 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e507450-eb79-43bc-ae7b-89352c222a44" containerName="init" Feb 27 01:28:25 crc kubenswrapper[4771]: E0227 01:28:25.888147 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e507450-eb79-43bc-ae7b-89352c222a44" containerName="dnsmasq-dns" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.888154 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e507450-eb79-43bc-ae7b-89352c222a44" containerName="dnsmasq-dns" Feb 27 01:28:25 crc kubenswrapper[4771]: E0227 01:28:25.888168 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="55452564-ea38-46a1-bb44-7cac2ae8840a" containerName="dnsmasq-dns" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.888176 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="55452564-ea38-46a1-bb44-7cac2ae8840a" containerName="dnsmasq-dns" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.888422 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="55452564-ea38-46a1-bb44-7cac2ae8840a" containerName="dnsmasq-dns" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.888439 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e507450-eb79-43bc-ae7b-89352c222a44" containerName="dnsmasq-dns" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.888454 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d28e3a3-ccd6-4c74-996b-cb8844471672" containerName="oc" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.889295 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.902717 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.903115 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.904109 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.904380 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:28:25 crc kubenswrapper[4771]: I0227 01:28:25.914104 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt"] Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.018442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.018497 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n8j2\" (UniqueName: \"kubernetes.io/projected/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-kube-api-access-4n8j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.018528 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.018786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.120560 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.121142 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.121260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n8j2\" (UniqueName: \"kubernetes.io/projected/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-kube-api-access-4n8j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.121366 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.125735 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.126278 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.129039 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.143185 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4n8j2\" (UniqueName: \"kubernetes.io/projected/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-kube-api-access-4n8j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.237689 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.244261 4771 generic.go:334] "Generic (PLEG): container finished" podID="370e8739-d955-433e-8f61-b8e3bc1d8dc7" containerID="240e8ae49c92a5945e4f0381ea3f0fa3667e76a1be7ac727e601acccb64a457f" exitCode=0 Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.244357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"370e8739-d955-433e-8f61-b8e3bc1d8dc7","Type":"ContainerDied","Data":"240e8ae49c92a5945e4f0381ea3f0fa3667e76a1be7ac727e601acccb64a457f"} Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.246380 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7813115d-b642-406c-892d-61b10c9777d2","Type":"ContainerStarted","Data":"3226a6ccca5530bf4cd2a5a507da3f9da9fb01518d5f3f5aa5350a940afd3e32"} Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.246723 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.874458 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.874437849 podStartE2EDuration="37.874437849s" podCreationTimestamp="2026-02-27 01:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:28:26.316286376 +0000 UTC m=+1419.253847694" watchObservedRunningTime="2026-02-27 01:28:26.874437849 +0000 UTC m=+1419.811999137" Feb 27 01:28:26 crc kubenswrapper[4771]: I0227 01:28:26.877571 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt"] Feb 27 01:28:27 crc kubenswrapper[4771]: I0227 01:28:27.256407 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" event={"ID":"c6b0ecf8-2611-4192-94ad-c4f9974cbab9","Type":"ContainerStarted","Data":"7154c1febca1eee3792a90a0fd850e2a98a3bf987f47c9bb0a4fa657429919d5"} Feb 27 01:28:27 crc kubenswrapper[4771]: I0227 01:28:27.259639 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"370e8739-d955-433e-8f61-b8e3bc1d8dc7","Type":"ContainerStarted","Data":"f98fd7d92570008c4383015cc1341748dd8551290d3b4d1f0cdee609b80c6bf9"} Feb 27 01:28:27 crc kubenswrapper[4771]: I0227 01:28:27.291205 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.291178382 podStartE2EDuration="38.291178382s" podCreationTimestamp="2026-02-27 01:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 01:28:27.281860273 +0000 UTC m=+1420.219421561" watchObservedRunningTime="2026-02-27 01:28:27.291178382 +0000 UTC m=+1420.228739690" Feb 27 01:28:30 crc 
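kubenswrapper[4771]: I0227 01:28:30.275259 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"

In the pod_startup_latency_tracker records here, podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure less the image-pull window (firstStartedPulling to lastFinishedPulling); for rabbitmq-server-0 above the two coincide because the pull timestamps are the zero time. The repo-setup record below reports 15.297631288s end-to-end but only 4.374534134s against the SLO. A short check of that arithmetic with the timestamps copied from the record; the ~30ns residue against the logged SLO value disappears if the pull window is taken from the record's monotonic m=+ offsets instead of the wall-clock stamps, which is our inference from the numbers, not a documented contract:

    package main

    import (
        "fmt"
        "time"
    )

    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Timestamps copied from the repo-setup-edpm-deployment record below.
        created := mustParse("2026-02-27 01:28:25 +0000 UTC")
        watchObserved := mustParse("2026-02-27 01:28:40.297631288 +0000 UTC")
        pullStart := mustParse("2026-02-27 01:28:26.898568257 +0000 UTC")
        pullEnd := mustParse("2026-02-27 01:28:37.821665381 +0000 UTC")

        e2e := watchObserved.Sub(created) // 15.297631288s, the logged podStartE2EDuration
        slo := e2e - pullEnd.Sub(pullStart)
        fmt.Println(e2e, slo) // 15.297631288s 4.374534164s (log says 4.374534134s; see note above)
    }
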
Feb 27 01:28:38 crc kubenswrapper[4771]: I0227 01:28:38.372344 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" event={"ID":"c6b0ecf8-2611-4192-94ad-c4f9974cbab9","Type":"ContainerStarted","Data":"b3de15a3b55aaaa29d6e6a9e7da2ba5a1cc315faa234ffe9936427051325484b"} Feb 27 01:28:40 crc kubenswrapper[4771]: I0227 01:28:40.265292 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 27 01:28:40 crc kubenswrapper[4771]: I0227 01:28:40.278883 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 27 01:28:40 crc kubenswrapper[4771]: I0227 01:28:40.297657 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" podStartSLOduration=4.374534134 podStartE2EDuration="15.297631288s" podCreationTimestamp="2026-02-27 01:28:25 +0000 UTC" firstStartedPulling="2026-02-27 01:28:26.898568257 +0000 UTC m=+1419.836129545" lastFinishedPulling="2026-02-27 01:28:37.821665381 +0000 UTC m=+1430.759226699" observedRunningTime="2026-02-27 01:28:38.399483162 +0000 UTC m=+1431.337044490" watchObservedRunningTime="2026-02-27 01:28:40.297631288 +0000 UTC m=+1433.235192606" Feb 27 01:28:49 crc kubenswrapper[4771]: I0227 01:28:49.502015 4771 generic.go:334] "Generic (PLEG): container finished" podID="c6b0ecf8-2611-4192-94ad-c4f9974cbab9" containerID="b3de15a3b55aaaa29d6e6a9e7da2ba5a1cc315faa234ffe9936427051325484b" exitCode=0 Feb 27 01:28:49 crc kubenswrapper[4771]: I0227 01:28:49.502128 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" event={"ID":"c6b0ecf8-2611-4192-94ad-c4f9974cbab9","Type":"ContainerDied","Data":"b3de15a3b55aaaa29d6e6a9e7da2ba5a1cc315faa234ffe9936427051325484b"} Feb 27 01:28:50 crc kubenswrapper[4771]: I0227 01:28:50.947374 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.046270 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-inventory\") pod \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.046488 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-repo-setup-combined-ca-bundle\") pod \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.046581 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-ssh-key-openstack-edpm-ipam\") pod \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.046640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n8j2\" (UniqueName: \"kubernetes.io/projected/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-kube-api-access-4n8j2\") pod \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\" (UID: \"c6b0ecf8-2611-4192-94ad-c4f9974cbab9\") " Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.052534 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-kube-api-access-4n8j2" (OuterVolumeSpecName: "kube-api-access-4n8j2") pod "c6b0ecf8-2611-4192-94ad-c4f9974cbab9" (UID: "c6b0ecf8-2611-4192-94ad-c4f9974cbab9"). InnerVolumeSpecName "kube-api-access-4n8j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.052865 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c6b0ecf8-2611-4192-94ad-c4f9974cbab9" (UID: "c6b0ecf8-2611-4192-94ad-c4f9974cbab9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.075360 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-inventory" (OuterVolumeSpecName: "inventory") pod "c6b0ecf8-2611-4192-94ad-c4f9974cbab9" (UID: "c6b0ecf8-2611-4192-94ad-c4f9974cbab9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.079133 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c6b0ecf8-2611-4192-94ad-c4f9974cbab9" (UID: "c6b0ecf8-2611-4192-94ad-c4f9974cbab9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.149718 4771 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.149781 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.149810 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n8j2\" (UniqueName: \"kubernetes.io/projected/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-kube-api-access-4n8j2\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.149869 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6b0ecf8-2611-4192-94ad-c4f9974cbab9-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.524500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" event={"ID":"c6b0ecf8-2611-4192-94ad-c4f9974cbab9","Type":"ContainerDied","Data":"7154c1febca1eee3792a90a0fd850e2a98a3bf987f47c9bb0a4fa657429919d5"} Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.524846 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7154c1febca1eee3792a90a0fd850e2a98a3bf987f47c9bb0a4fa657429919d5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.524939 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.627562 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5"] Feb 27 01:28:51 crc kubenswrapper[4771]: E0227 01:28:51.628450 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b0ecf8-2611-4192-94ad-c4f9974cbab9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.628484 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b0ecf8-2611-4192-94ad-c4f9974cbab9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.628784 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b0ecf8-2611-4192-94ad-c4f9974cbab9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.630088 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.639416 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.639703 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.639973 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.640127 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.641029 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5"] Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.764166 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.764244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.764366 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgzg4\" (UniqueName: \"kubernetes.io/projected/4e2d0148-506d-458b-89c3-1faf19410b6b-kube-api-access-zgzg4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.866485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgzg4\" (UniqueName: \"kubernetes.io/projected/4e2d0148-506d-458b-89c3-1faf19410b6b-kube-api-access-zgzg4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.866694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.866735 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.872376 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.883224 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.886715 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgzg4\" (UniqueName: \"kubernetes.io/projected/4e2d0148-506d-458b-89c3-1faf19410b6b-kube-api-access-zgzg4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-f88s5\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:51 crc kubenswrapper[4771]: I0227 01:28:51.952956 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:52 crc kubenswrapper[4771]: I0227 01:28:52.474541 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5"] Feb 27 01:28:52 crc kubenswrapper[4771]: I0227 01:28:52.538492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" event={"ID":"4e2d0148-506d-458b-89c3-1faf19410b6b","Type":"ContainerStarted","Data":"d5eed138ad02f206bbf03d804d809be4191419a6f0d0373ee2cd2cd427fa300c"} Feb 27 01:28:53 crc kubenswrapper[4771]: I0227 01:28:53.557793 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" event={"ID":"4e2d0148-506d-458b-89c3-1faf19410b6b","Type":"ContainerStarted","Data":"9483c00d0b5230297fa2f333b5a509c5d8838f1a8dd65c47c5e394de9f35d939"} Feb 27 01:28:53 crc kubenswrapper[4771]: I0227 01:28:53.602512 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" podStartSLOduration=2.125751736 podStartE2EDuration="2.602474011s" podCreationTimestamp="2026-02-27 01:28:51 +0000 UTC" firstStartedPulling="2026-02-27 01:28:52.49189683 +0000 UTC m=+1445.429458118" lastFinishedPulling="2026-02-27 01:28:52.968619105 +0000 UTC m=+1445.906180393" observedRunningTime="2026-02-27 01:28:53.588248339 +0000 UTC m=+1446.525809647" watchObservedRunningTime="2026-02-27 01:28:53.602474011 +0000 UTC m=+1446.540035299" Feb 27 01:28:55 crc kubenswrapper[4771]: I0227 01:28:55.589016 4771 generic.go:334] "Generic (PLEG): container finished" podID="4e2d0148-506d-458b-89c3-1faf19410b6b" containerID="9483c00d0b5230297fa2f333b5a509c5d8838f1a8dd65c47c5e394de9f35d939" exitCode=0 Feb 27 01:28:55 crc kubenswrapper[4771]: I0227 01:28:55.589113 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" event={"ID":"4e2d0148-506d-458b-89c3-1faf19410b6b","Type":"ContainerDied","Data":"9483c00d0b5230297fa2f333b5a509c5d8838f1a8dd65c47c5e394de9f35d939"} Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.059957 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.179105 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgzg4\" (UniqueName: \"kubernetes.io/projected/4e2d0148-506d-458b-89c3-1faf19410b6b-kube-api-access-zgzg4\") pod \"4e2d0148-506d-458b-89c3-1faf19410b6b\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.179249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-ssh-key-openstack-edpm-ipam\") pod \"4e2d0148-506d-458b-89c3-1faf19410b6b\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.179353 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-inventory\") pod \"4e2d0148-506d-458b-89c3-1faf19410b6b\" (UID: \"4e2d0148-506d-458b-89c3-1faf19410b6b\") " Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.184463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2d0148-506d-458b-89c3-1faf19410b6b-kube-api-access-zgzg4" (OuterVolumeSpecName: "kube-api-access-zgzg4") pod "4e2d0148-506d-458b-89c3-1faf19410b6b" (UID: "4e2d0148-506d-458b-89c3-1faf19410b6b"). InnerVolumeSpecName "kube-api-access-zgzg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.207081 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4e2d0148-506d-458b-89c3-1faf19410b6b" (UID: "4e2d0148-506d-458b-89c3-1faf19410b6b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.207917 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-inventory" (OuterVolumeSpecName: "inventory") pod "4e2d0148-506d-458b-89c3-1faf19410b6b" (UID: "4e2d0148-506d-458b-89c3-1faf19410b6b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.281371 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgzg4\" (UniqueName: \"kubernetes.io/projected/4e2d0148-506d-458b-89c3-1faf19410b6b-kube-api-access-zgzg4\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.281408 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.281418 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e2d0148-506d-458b-89c3-1faf19410b6b-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.623415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" event={"ID":"4e2d0148-506d-458b-89c3-1faf19410b6b","Type":"ContainerDied","Data":"d5eed138ad02f206bbf03d804d809be4191419a6f0d0373ee2cd2cd427fa300c"} Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.624326 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5eed138ad02f206bbf03d804d809be4191419a6f0d0373ee2cd2cd427fa300c" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.623581 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-f88s5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.799270 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5"] Feb 27 01:28:57 crc kubenswrapper[4771]: E0227 01:28:57.799772 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2d0148-506d-458b-89c3-1faf19410b6b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.799807 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2d0148-506d-458b-89c3-1faf19410b6b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.800099 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2d0148-506d-458b-89c3-1faf19410b6b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.801694 4771 util.go:30] "No sandbox for pod can be found. 
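Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5"

The cpu_manager, state_mem and memory_manager records just above show a housekeeping pass that runs when a new pod is admitted: per-container CPU and memory assignments belonging to pods that no longer exist (here the just-deleted redhat-edpm job container) are purged before the new pod is accounted. Purely as an illustration of that sweep pattern, with types and names invented for the sketch rather than taken from kubelet:

    package main

    import "fmt"

    // key identifies a container's resource assignment; the shape is ours.
    type key struct{ podUID, container string }

    // sweep drops assignments whose pod is no longer known, mirroring the
    // "RemoveStaleState" / "Deleted CPUSet assignment" records above.
    func sweep(assignments map[key][]int, alive map[string]bool) {
        for k := range assignments { // deleting during range is safe in Go
            if !alive[k.podUID] {
                fmt.Printf("Deleted CPUSet assignment podUID=%q containerName=%q\n",
                    k.podUID, k.container)
                delete(assignments, k)
            }
        }
    }

    func main() {
        assignments := map[key][]int{
            {"4e2d0148-506d-458b-89c3-1faf19410b6b", "redhat-edpm-deployment-openstack-edpm-ipam"}: {0, 1},
        }
        sweep(assignments, map[string]bool{}) // pod already deleted -> entry purged
    }
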
Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.804579 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.804940 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.805295 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.805378 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.809998 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5"] Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.893014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.893182 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vf68\" (UniqueName: \"kubernetes.io/projected/34ae2923-be95-45e5-a840-dfea9b17f9c4-kube-api-access-5vf68\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.893866 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.894168 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.995803 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.995889 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.995963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:57 crc kubenswrapper[4771]: I0227 01:28:57.996025 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vf68\" (UniqueName: \"kubernetes.io/projected/34ae2923-be95-45e5-a840-dfea9b17f9c4-kube-api-access-5vf68\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:58 crc kubenswrapper[4771]: I0227 01:28:58.001396 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:58 crc kubenswrapper[4771]: I0227 01:28:58.001631 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:58 crc kubenswrapper[4771]: I0227 01:28:58.003296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:58 crc kubenswrapper[4771]: I0227 01:28:58.026956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vf68\" (UniqueName: \"kubernetes.io/projected/34ae2923-be95-45e5-a840-dfea9b17f9c4-kube-api-access-5vf68\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:58 crc kubenswrapper[4771]: I0227 01:28:58.175197 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:28:58 crc kubenswrapper[4771]: I0227 01:28:58.634819 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5"] Feb 27 01:28:59 crc kubenswrapper[4771]: I0227 01:28:59.646796 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" event={"ID":"34ae2923-be95-45e5-a840-dfea9b17f9c4","Type":"ContainerStarted","Data":"c60249e9cd18d51b8fb2701778227ad7426d56e3fe9e025b27a397eb346d6a6a"} Feb 27 01:28:59 crc kubenswrapper[4771]: I0227 01:28:59.647460 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" event={"ID":"34ae2923-be95-45e5-a840-dfea9b17f9c4","Type":"ContainerStarted","Data":"7bd272f7844dc05fc67a08d66f97b2fe7a37aa72d23ee8b0314c686df94b47e1"} Feb 27 01:28:59 crc kubenswrapper[4771]: I0227 01:28:59.669465 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" podStartSLOduration=2.260185712 podStartE2EDuration="2.669448655s" podCreationTimestamp="2026-02-27 01:28:57 +0000 UTC" firstStartedPulling="2026-02-27 01:28:58.645121542 +0000 UTC m=+1451.582682830" lastFinishedPulling="2026-02-27 01:28:59.054384484 +0000 UTC m=+1451.991945773" observedRunningTime="2026-02-27 01:28:59.666118016 +0000 UTC m=+1452.603679304" watchObservedRunningTime="2026-02-27 01:28:59.669448655 +0000 UTC m=+1452.607009943" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.167538 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dp86r"] Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.203715 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dp86r"] Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.204745 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.338273 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6xft\" (UniqueName: \"kubernetes.io/projected/4500823f-a7a6-4096-91c4-7ac87ff6cdef-kube-api-access-d6xft\") pod \"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.338422 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-catalog-content\") pod \"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.338448 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-utilities\") pod \"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.440877 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6xft\" (UniqueName: \"kubernetes.io/projected/4500823f-a7a6-4096-91c4-7ac87ff6cdef-kube-api-access-d6xft\") pod \"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.441138 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-catalog-content\") pod \"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.441208 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-utilities\") pod \"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.442074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-utilities\") pod \"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.443170 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-catalog-content\") pod \"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.462038 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6xft\" (UniqueName: \"kubernetes.io/projected/4500823f-a7a6-4096-91c4-7ac87ff6cdef-kube-api-access-d6xft\") pod 
\"community-operators-dp86r\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:08 crc kubenswrapper[4771]: I0227 01:29:08.534733 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:09 crc kubenswrapper[4771]: I0227 01:29:09.084635 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dp86r"] Feb 27 01:29:09 crc kubenswrapper[4771]: W0227 01:29:09.088794 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4500823f_a7a6_4096_91c4_7ac87ff6cdef.slice/crio-ab7dd45edd08c5eb58020b405a3bdf4c3ca52712a3ac238ab29f5123101635dd WatchSource:0}: Error finding container ab7dd45edd08c5eb58020b405a3bdf4c3ca52712a3ac238ab29f5123101635dd: Status 404 returned error can't find the container with id ab7dd45edd08c5eb58020b405a3bdf4c3ca52712a3ac238ab29f5123101635dd Feb 27 01:29:09 crc kubenswrapper[4771]: I0227 01:29:09.770818 4771 generic.go:334] "Generic (PLEG): container finished" podID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerID="6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664" exitCode=0 Feb 27 01:29:09 crc kubenswrapper[4771]: I0227 01:29:09.770903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dp86r" event={"ID":"4500823f-a7a6-4096-91c4-7ac87ff6cdef","Type":"ContainerDied","Data":"6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664"} Feb 27 01:29:09 crc kubenswrapper[4771]: I0227 01:29:09.771108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dp86r" event={"ID":"4500823f-a7a6-4096-91c4-7ac87ff6cdef","Type":"ContainerStarted","Data":"ab7dd45edd08c5eb58020b405a3bdf4c3ca52712a3ac238ab29f5123101635dd"} Feb 27 01:29:11 crc kubenswrapper[4771]: I0227 01:29:11.813798 4771 generic.go:334] "Generic (PLEG): container finished" podID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerID="12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331" exitCode=0 Feb 27 01:29:11 crc kubenswrapper[4771]: I0227 01:29:11.814271 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dp86r" event={"ID":"4500823f-a7a6-4096-91c4-7ac87ff6cdef","Type":"ContainerDied","Data":"12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331"} Feb 27 01:29:11 crc kubenswrapper[4771]: I0227 01:29:11.818266 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:29:12 crc kubenswrapper[4771]: I0227 01:29:12.823496 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dp86r" event={"ID":"4500823f-a7a6-4096-91c4-7ac87ff6cdef","Type":"ContainerStarted","Data":"db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6"} Feb 27 01:29:12 crc kubenswrapper[4771]: I0227 01:29:12.841976 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dp86r" podStartSLOduration=2.236618001 podStartE2EDuration="4.841958252s" podCreationTimestamp="2026-02-27 01:29:08 +0000 UTC" firstStartedPulling="2026-02-27 01:29:09.773434859 +0000 UTC m=+1462.710996167" lastFinishedPulling="2026-02-27 01:29:12.37877513 +0000 UTC m=+1465.316336418" observedRunningTime="2026-02-27 01:29:12.838118709 +0000 UTC m=+1465.775680017" 
watchObservedRunningTime="2026-02-27 01:29:12.841958252 +0000 UTC m=+1465.779519540" Feb 27 01:29:18 crc kubenswrapper[4771]: I0227 01:29:18.535470 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:18 crc kubenswrapper[4771]: I0227 01:29:18.535778 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:18 crc kubenswrapper[4771]: I0227 01:29:18.581454 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:18 crc kubenswrapper[4771]: I0227 01:29:18.956387 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:19 crc kubenswrapper[4771]: I0227 01:29:19.018316 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dp86r"] Feb 27 01:29:20 crc kubenswrapper[4771]: I0227 01:29:20.905031 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dp86r" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerName="registry-server" containerID="cri-o://db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6" gracePeriod=2 Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.438515 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.628671 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6xft\" (UniqueName: \"kubernetes.io/projected/4500823f-a7a6-4096-91c4-7ac87ff6cdef-kube-api-access-d6xft\") pod \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.628749 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-catalog-content\") pod \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.628905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-utilities\") pod \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\" (UID: \"4500823f-a7a6-4096-91c4-7ac87ff6cdef\") " Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.630063 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-utilities" (OuterVolumeSpecName: "utilities") pod "4500823f-a7a6-4096-91c4-7ac87ff6cdef" (UID: "4500823f-a7a6-4096-91c4-7ac87ff6cdef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.640794 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4500823f-a7a6-4096-91c4-7ac87ff6cdef-kube-api-access-d6xft" (OuterVolumeSpecName: "kube-api-access-d6xft") pod "4500823f-a7a6-4096-91c4-7ac87ff6cdef" (UID: "4500823f-a7a6-4096-91c4-7ac87ff6cdef"). InnerVolumeSpecName "kube-api-access-d6xft". 
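PluginName "kubernetes.io/projected", VolumeGidValue ""

The SyncLoop (probe) records just above trace community-operators-dp86r's startup probe flipping from "unhealthy" to "started" and its readiness probe from empty to "ready"; further down, a liveness probe against the machine-config-daemon fails outright with connection refused. For an httpGet probe the verdict reduces to whether the request completes with a 2xx/3xx status. A minimal sketch of one such check; the URL reuses the port from that failing liveness record, and the one-second timeout is our choice, not kubelet's default:

    package main

    import (
        "context"
        "fmt"
        "net/http"
        "time"
    )

    // probe runs one HTTP health check in the spirit of an httpGet probe:
    // a 2xx/3xx status passes; anything else, including connect errors such
    // as "connection refused", fails.
    func probe(ctx context.Context, url string) error {
        req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
        if err != nil {
            return err
        }
        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 400 {
            return fmt.Errorf("unhealthy: status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()
        if err := probe(ctx, "http://127.0.0.1:8798/health"); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }
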
Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.685298 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4500823f-a7a6-4096-91c4-7ac87ff6cdef" (UID: "4500823f-a7a6-4096-91c4-7ac87ff6cdef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.730493 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.730538 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6xft\" (UniqueName: \"kubernetes.io/projected/4500823f-a7a6-4096-91c4-7ac87ff6cdef-kube-api-access-d6xft\") on node \"crc\" DevicePath \"\"" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.730572 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4500823f-a7a6-4096-91c4-7ac87ff6cdef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.914782 4771 generic.go:334] "Generic (PLEG): container finished" podID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerID="db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6" exitCode=0 Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.914859 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dp86r" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.914854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dp86r" event={"ID":"4500823f-a7a6-4096-91c4-7ac87ff6cdef","Type":"ContainerDied","Data":"db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6"} Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.915178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dp86r" event={"ID":"4500823f-a7a6-4096-91c4-7ac87ff6cdef","Type":"ContainerDied","Data":"ab7dd45edd08c5eb58020b405a3bdf4c3ca52712a3ac238ab29f5123101635dd"} Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.915203 4771 scope.go:117] "RemoveContainer" containerID="db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.949422 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dp86r"] Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.958722 4771 scope.go:117] "RemoveContainer" containerID="12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331" Feb 27 01:29:21 crc kubenswrapper[4771]: I0227 01:29:21.961423 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dp86r"] Feb 27 01:29:22 crc kubenswrapper[4771]: I0227 01:29:22.012541 4771 scope.go:117] "RemoveContainer" containerID="6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664" Feb 27 01:29:22 crc kubenswrapper[4771]: I0227 01:29:22.056563 4771 scope.go:117] "RemoveContainer" containerID="db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6" Feb 27 01:29:22 crc kubenswrapper[4771]: E0227 01:29:22.057256 4771 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6\": container with ID starting with db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6 not found: ID does not exist" containerID="db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6" Feb 27 01:29:22 crc kubenswrapper[4771]: I0227 01:29:22.057295 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6"} err="failed to get container status \"db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6\": rpc error: code = NotFound desc = could not find container \"db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6\": container with ID starting with db7b6042edf5c60e6c37784736132aa07b92722aaa21aa835b264c3329ff34c6 not found: ID does not exist" Feb 27 01:29:22 crc kubenswrapper[4771]: I0227 01:29:22.057319 4771 scope.go:117] "RemoveContainer" containerID="12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331" Feb 27 01:29:22 crc kubenswrapper[4771]: E0227 01:29:22.057838 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331\": container with ID starting with 12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331 not found: ID does not exist" containerID="12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331" Feb 27 01:29:22 crc kubenswrapper[4771]: I0227 01:29:22.057884 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331"} err="failed to get container status \"12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331\": rpc error: code = NotFound desc = could not find container \"12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331\": container with ID starting with 12a6098fc32bca88d6d174a6c6eaad45714547d87347d2b896dd2a39f7a73331 not found: ID does not exist" Feb 27 01:29:22 crc kubenswrapper[4771]: I0227 01:29:22.057919 4771 scope.go:117] "RemoveContainer" containerID="6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664" Feb 27 01:29:22 crc kubenswrapper[4771]: E0227 01:29:22.058470 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664\": container with ID starting with 6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664 not found: ID does not exist" containerID="6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664" Feb 27 01:29:22 crc kubenswrapper[4771]: I0227 01:29:22.058500 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664"} err="failed to get container status \"6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664\": rpc error: code = NotFound desc = could not find container \"6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664\": container with ID starting with 6c629cdf82a6c182948bf01451e44e5998852f4ab52e61ce151063c208c59664 not found: ID does not exist" Feb 27 01:29:23 crc kubenswrapper[4771]: I0227 01:29:23.789376 4771 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" path="/var/lib/kubelet/pods/4500823f-a7a6-4096-91c4-7ac87ff6cdef/volumes" Feb 27 01:29:26 crc kubenswrapper[4771]: I0227 01:29:26.051130 4771 scope.go:117] "RemoveContainer" containerID="54112211ee0425cdc9f26c6c5c1245475e2e0ee45d665733fc638c86f483bb2a" Feb 27 01:29:26 crc kubenswrapper[4771]: I0227 01:29:26.079156 4771 scope.go:117] "RemoveContainer" containerID="5b673c146cecb1f36e1160b06cd8e4b5697929ed2b501711044d9f881a5cc945" Feb 27 01:29:26 crc kubenswrapper[4771]: I0227 01:29:26.132450 4771 scope.go:117] "RemoveContainer" containerID="9eac1ca5ec14913833db7f0abc8b1c19460bb4ed27ce8a4af8bb66832562b005" Feb 27 01:29:58 crc kubenswrapper[4771]: I0227 01:29:58.953411 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:29:58 crc kubenswrapper[4771]: I0227 01:29:58.954132 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.144924 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535930-g6sj2"] Feb 27 01:30:00 crc kubenswrapper[4771]: E0227 01:30:00.145470 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerName="registry-server" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.145487 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerName="registry-server" Feb 27 01:30:00 crc kubenswrapper[4771]: E0227 01:30:00.145507 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerName="extract-utilities" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.145516 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerName="extract-utilities" Feb 27 01:30:00 crc kubenswrapper[4771]: E0227 01:30:00.145570 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerName="extract-content" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.145580 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerName="extract-content" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.145821 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4500823f-a7a6-4096-91c4-7ac87ff6cdef" containerName="registry-server" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.146688 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-g6sj2" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.152182 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.152290 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.152671 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.155775 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns"] Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.157110 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.164004 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.166978 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535930-g6sj2"] Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.169442 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.174685 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47f3600b-05b8-494c-b37b-85be607f8186-config-volume\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.174751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47f3600b-05b8-494c-b37b-85be607f8186-secret-volume\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.175224 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcccm\" (UniqueName: \"kubernetes.io/projected/47f3600b-05b8-494c-b37b-85be607f8186-kube-api-access-rcccm\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.175316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp6xg\" (UniqueName: \"kubernetes.io/projected/3e3b6274-faf2-4259-908a-c50a1babdb81-kube-api-access-kp6xg\") pod \"auto-csr-approver-29535930-g6sj2\" (UID: \"3e3b6274-faf2-4259-908a-c50a1babdb81\") " pod="openshift-infra/auto-csr-approver-29535930-g6sj2" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.177273 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns"] Feb 
27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.276947 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcccm\" (UniqueName: \"kubernetes.io/projected/47f3600b-05b8-494c-b37b-85be607f8186-kube-api-access-rcccm\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.277007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp6xg\" (UniqueName: \"kubernetes.io/projected/3e3b6274-faf2-4259-908a-c50a1babdb81-kube-api-access-kp6xg\") pod \"auto-csr-approver-29535930-g6sj2\" (UID: \"3e3b6274-faf2-4259-908a-c50a1babdb81\") " pod="openshift-infra/auto-csr-approver-29535930-g6sj2" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.277069 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47f3600b-05b8-494c-b37b-85be607f8186-config-volume\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.277103 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47f3600b-05b8-494c-b37b-85be607f8186-secret-volume\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.278164 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47f3600b-05b8-494c-b37b-85be607f8186-config-volume\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.283807 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47f3600b-05b8-494c-b37b-85be607f8186-secret-volume\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.294764 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcccm\" (UniqueName: \"kubernetes.io/projected/47f3600b-05b8-494c-b37b-85be607f8186-kube-api-access-rcccm\") pod \"collect-profiles-29535930-wkmns\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.296170 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp6xg\" (UniqueName: \"kubernetes.io/projected/3e3b6274-faf2-4259-908a-c50a1babdb81-kube-api-access-kp6xg\") pod \"auto-csr-approver-29535930-g6sj2\" (UID: \"3e3b6274-faf2-4259-908a-c50a1babdb81\") " pod="openshift-infra/auto-csr-approver-29535930-g6sj2" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.474122 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-g6sj2" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.484969 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:00 crc kubenswrapper[4771]: I0227 01:30:00.941431 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns"] Feb 27 01:30:00 crc kubenswrapper[4771]: W0227 01:30:00.948473 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47f3600b_05b8_494c_b37b_85be607f8186.slice/crio-800f551ed0d04ea3d4fdff3add0647f6c22360b36ea1987def6819e14d1f3089 WatchSource:0}: Error finding container 800f551ed0d04ea3d4fdff3add0647f6c22360b36ea1987def6819e14d1f3089: Status 404 returned error can't find the container with id 800f551ed0d04ea3d4fdff3add0647f6c22360b36ea1987def6819e14d1f3089 Feb 27 01:30:01 crc kubenswrapper[4771]: I0227 01:30:01.017288 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535930-g6sj2"] Feb 27 01:30:01 crc kubenswrapper[4771]: W0227 01:30:01.033617 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3b6274_faf2_4259_908a_c50a1babdb81.slice/crio-ed7dd1500c5232996fa5390b088788c629a86811c2b47ab65a66f4f750f32542 WatchSource:0}: Error finding container ed7dd1500c5232996fa5390b088788c629a86811c2b47ab65a66f4f750f32542: Status 404 returned error can't find the container with id ed7dd1500c5232996fa5390b088788c629a86811c2b47ab65a66f4f750f32542 Feb 27 01:30:01 crc kubenswrapper[4771]: I0227 01:30:01.360758 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-g6sj2" event={"ID":"3e3b6274-faf2-4259-908a-c50a1babdb81","Type":"ContainerStarted","Data":"ed7dd1500c5232996fa5390b088788c629a86811c2b47ab65a66f4f750f32542"} Feb 27 01:30:01 crc kubenswrapper[4771]: I0227 01:30:01.362739 4771 generic.go:334] "Generic (PLEG): container finished" podID="47f3600b-05b8-494c-b37b-85be607f8186" containerID="bea3b2af5322da83e83d96d97985a4ab4438b6e9e603c3ad651edb94fc325886" exitCode=0 Feb 27 01:30:01 crc kubenswrapper[4771]: I0227 01:30:01.362810 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" event={"ID":"47f3600b-05b8-494c-b37b-85be607f8186","Type":"ContainerDied","Data":"bea3b2af5322da83e83d96d97985a4ab4438b6e9e603c3ad651edb94fc325886"} Feb 27 01:30:01 crc kubenswrapper[4771]: I0227 01:30:01.362962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" event={"ID":"47f3600b-05b8-494c-b37b-85be607f8186","Type":"ContainerStarted","Data":"800f551ed0d04ea3d4fdff3add0647f6c22360b36ea1987def6819e14d1f3089"} Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.710622 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.724941 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47f3600b-05b8-494c-b37b-85be607f8186-config-volume\") pod \"47f3600b-05b8-494c-b37b-85be607f8186\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.725051 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47f3600b-05b8-494c-b37b-85be607f8186-secret-volume\") pod \"47f3600b-05b8-494c-b37b-85be607f8186\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.725084 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcccm\" (UniqueName: \"kubernetes.io/projected/47f3600b-05b8-494c-b37b-85be607f8186-kube-api-access-rcccm\") pod \"47f3600b-05b8-494c-b37b-85be607f8186\" (UID: \"47f3600b-05b8-494c-b37b-85be607f8186\") " Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.740337 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47f3600b-05b8-494c-b37b-85be607f8186-config-volume" (OuterVolumeSpecName: "config-volume") pod "47f3600b-05b8-494c-b37b-85be607f8186" (UID: "47f3600b-05b8-494c-b37b-85be607f8186"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.743568 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f3600b-05b8-494c-b37b-85be607f8186-kube-api-access-rcccm" (OuterVolumeSpecName: "kube-api-access-rcccm") pod "47f3600b-05b8-494c-b37b-85be607f8186" (UID: "47f3600b-05b8-494c-b37b-85be607f8186"). InnerVolumeSpecName "kube-api-access-rcccm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.767805 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f3600b-05b8-494c-b37b-85be607f8186-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "47f3600b-05b8-494c-b37b-85be607f8186" (UID: "47f3600b-05b8-494c-b37b-85be607f8186"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.828897 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47f3600b-05b8-494c-b37b-85be607f8186-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.828943 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47f3600b-05b8-494c-b37b-85be607f8186-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:02 crc kubenswrapper[4771]: I0227 01:30:02.828959 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcccm\" (UniqueName: \"kubernetes.io/projected/47f3600b-05b8-494c-b37b-85be607f8186-kube-api-access-rcccm\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:03 crc kubenswrapper[4771]: I0227 01:30:03.383673 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" event={"ID":"47f3600b-05b8-494c-b37b-85be607f8186","Type":"ContainerDied","Data":"800f551ed0d04ea3d4fdff3add0647f6c22360b36ea1987def6819e14d1f3089"} Feb 27 01:30:03 crc kubenswrapper[4771]: I0227 01:30:03.383719 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800f551ed0d04ea3d4fdff3add0647f6c22360b36ea1987def6819e14d1f3089" Feb 27 01:30:03 crc kubenswrapper[4771]: I0227 01:30:03.383765 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns" Feb 27 01:30:04 crc kubenswrapper[4771]: I0227 01:30:04.398933 4771 generic.go:334] "Generic (PLEG): container finished" podID="3e3b6274-faf2-4259-908a-c50a1babdb81" containerID="a03b248a22aa2e9b7dd3f17102eb2ab0fdc9f235c7b021aae5b056ecef362cb1" exitCode=0 Feb 27 01:30:04 crc kubenswrapper[4771]: I0227 01:30:04.399017 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-g6sj2" event={"ID":"3e3b6274-faf2-4259-908a-c50a1babdb81","Type":"ContainerDied","Data":"a03b248a22aa2e9b7dd3f17102eb2ab0fdc9f235c7b021aae5b056ecef362cb1"} Feb 27 01:30:05 crc kubenswrapper[4771]: I0227 01:30:05.815191 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-g6sj2" Feb 27 01:30:05 crc kubenswrapper[4771]: I0227 01:30:05.991413 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp6xg\" (UniqueName: \"kubernetes.io/projected/3e3b6274-faf2-4259-908a-c50a1babdb81-kube-api-access-kp6xg\") pod \"3e3b6274-faf2-4259-908a-c50a1babdb81\" (UID: \"3e3b6274-faf2-4259-908a-c50a1babdb81\") " Feb 27 01:30:05 crc kubenswrapper[4771]: I0227 01:30:05.997201 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3b6274-faf2-4259-908a-c50a1babdb81-kube-api-access-kp6xg" (OuterVolumeSpecName: "kube-api-access-kp6xg") pod "3e3b6274-faf2-4259-908a-c50a1babdb81" (UID: "3e3b6274-faf2-4259-908a-c50a1babdb81"). InnerVolumeSpecName "kube-api-access-kp6xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:30:06 crc kubenswrapper[4771]: I0227 01:30:06.094334 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp6xg\" (UniqueName: \"kubernetes.io/projected/3e3b6274-faf2-4259-908a-c50a1babdb81-kube-api-access-kp6xg\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:06 crc kubenswrapper[4771]: I0227 01:30:06.426916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535930-g6sj2" event={"ID":"3e3b6274-faf2-4259-908a-c50a1babdb81","Type":"ContainerDied","Data":"ed7dd1500c5232996fa5390b088788c629a86811c2b47ab65a66f4f750f32542"} Feb 27 01:30:06 crc kubenswrapper[4771]: I0227 01:30:06.426980 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed7dd1500c5232996fa5390b088788c629a86811c2b47ab65a66f4f750f32542" Feb 27 01:30:06 crc kubenswrapper[4771]: I0227 01:30:06.427062 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535930-g6sj2" Feb 27 01:30:06 crc kubenswrapper[4771]: I0227 01:30:06.898727 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-2x5jq"] Feb 27 01:30:06 crc kubenswrapper[4771]: I0227 01:30:06.908677 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535924-2x5jq"] Feb 27 01:30:07 crc kubenswrapper[4771]: I0227 01:30:07.793471 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ddad50-134a-4525-ade2-057c655b1a8c" path="/var/lib/kubelet/pods/71ddad50-134a-4525-ade2-057c655b1a8c/volumes" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.484610 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hlqcw"] Feb 27 01:30:11 crc kubenswrapper[4771]: E0227 01:30:11.486820 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f3600b-05b8-494c-b37b-85be607f8186" containerName="collect-profiles" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.486994 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f3600b-05b8-494c-b37b-85be607f8186" containerName="collect-profiles" Feb 27 01:30:11 crc kubenswrapper[4771]: E0227 01:30:11.487138 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3b6274-faf2-4259-908a-c50a1babdb81" containerName="oc" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.487243 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3b6274-faf2-4259-908a-c50a1babdb81" containerName="oc" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.487854 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3b6274-faf2-4259-908a-c50a1babdb81" containerName="oc" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.488006 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f3600b-05b8-494c-b37b-85be607f8186" containerName="collect-profiles" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.490726 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.506609 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hlqcw"] Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.510256 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w64w\" (UniqueName: \"kubernetes.io/projected/af19ec53-7df9-43c5-9df2-57f5235d94c9-kube-api-access-5w64w\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.510480 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-utilities\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.510791 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-catalog-content\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.614024 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w64w\" (UniqueName: \"kubernetes.io/projected/af19ec53-7df9-43c5-9df2-57f5235d94c9-kube-api-access-5w64w\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.614156 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-utilities\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.614193 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-catalog-content\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.614767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-catalog-content\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.614928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-utilities\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.635599 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5w64w\" (UniqueName: \"kubernetes.io/projected/af19ec53-7df9-43c5-9df2-57f5235d94c9-kube-api-access-5w64w\") pod \"certified-operators-hlqcw\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:11 crc kubenswrapper[4771]: I0227 01:30:11.832738 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:12 crc kubenswrapper[4771]: I0227 01:30:12.284989 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hlqcw"] Feb 27 01:30:12 crc kubenswrapper[4771]: I0227 01:30:12.503094 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlqcw" event={"ID":"af19ec53-7df9-43c5-9df2-57f5235d94c9","Type":"ContainerStarted","Data":"2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4"} Feb 27 01:30:12 crc kubenswrapper[4771]: I0227 01:30:12.503139 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlqcw" event={"ID":"af19ec53-7df9-43c5-9df2-57f5235d94c9","Type":"ContainerStarted","Data":"addeb311ba886d27b867db56c8e7a043ba7e5a484c7ca03707ca764b4da7f1a8"} Feb 27 01:30:13 crc kubenswrapper[4771]: I0227 01:30:13.514376 4771 generic.go:334] "Generic (PLEG): container finished" podID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerID="2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4" exitCode=0 Feb 27 01:30:13 crc kubenswrapper[4771]: I0227 01:30:13.514460 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlqcw" event={"ID":"af19ec53-7df9-43c5-9df2-57f5235d94c9","Type":"ContainerDied","Data":"2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4"} Feb 27 01:30:14 crc kubenswrapper[4771]: I0227 01:30:14.528845 4771 generic.go:334] "Generic (PLEG): container finished" podID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerID="ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e" exitCode=0 Feb 27 01:30:14 crc kubenswrapper[4771]: I0227 01:30:14.528924 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlqcw" event={"ID":"af19ec53-7df9-43c5-9df2-57f5235d94c9","Type":"ContainerDied","Data":"ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e"} Feb 27 01:30:15 crc kubenswrapper[4771]: I0227 01:30:15.541511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlqcw" event={"ID":"af19ec53-7df9-43c5-9df2-57f5235d94c9","Type":"ContainerStarted","Data":"2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47"} Feb 27 01:30:15 crc kubenswrapper[4771]: I0227 01:30:15.572303 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hlqcw" podStartSLOduration=2.152722168 podStartE2EDuration="4.572276036s" podCreationTimestamp="2026-02-27 01:30:11 +0000 UTC" firstStartedPulling="2026-02-27 01:30:12.505080642 +0000 UTC m=+1525.442641930" lastFinishedPulling="2026-02-27 01:30:14.92463447 +0000 UTC m=+1527.862195798" observedRunningTime="2026-02-27 01:30:15.569617814 +0000 UTC m=+1528.507179112" watchObservedRunningTime="2026-02-27 01:30:15.572276036 +0000 UTC m=+1528.509837364" Feb 27 01:30:21 crc kubenswrapper[4771]: I0227 01:30:21.833856 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:21 crc kubenswrapper[4771]: I0227 01:30:21.834955 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:21 crc kubenswrapper[4771]: I0227 01:30:21.905747 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:22 crc kubenswrapper[4771]: I0227 01:30:22.719860 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:22 crc kubenswrapper[4771]: I0227 01:30:22.788478 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hlqcw"] Feb 27 01:30:24 crc kubenswrapper[4771]: I0227 01:30:24.673353 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hlqcw" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerName="registry-server" containerID="cri-o://2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47" gracePeriod=2 Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.264180 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.326229 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w64w\" (UniqueName: \"kubernetes.io/projected/af19ec53-7df9-43c5-9df2-57f5235d94c9-kube-api-access-5w64w\") pod \"af19ec53-7df9-43c5-9df2-57f5235d94c9\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.326342 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-utilities\") pod \"af19ec53-7df9-43c5-9df2-57f5235d94c9\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.326542 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-catalog-content\") pod \"af19ec53-7df9-43c5-9df2-57f5235d94c9\" (UID: \"af19ec53-7df9-43c5-9df2-57f5235d94c9\") " Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.327696 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-utilities" (OuterVolumeSpecName: "utilities") pod "af19ec53-7df9-43c5-9df2-57f5235d94c9" (UID: "af19ec53-7df9-43c5-9df2-57f5235d94c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.333630 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af19ec53-7df9-43c5-9df2-57f5235d94c9-kube-api-access-5w64w" (OuterVolumeSpecName: "kube-api-access-5w64w") pod "af19ec53-7df9-43c5-9df2-57f5235d94c9" (UID: "af19ec53-7df9-43c5-9df2-57f5235d94c9"). InnerVolumeSpecName "kube-api-access-5w64w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.394322 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af19ec53-7df9-43c5-9df2-57f5235d94c9" (UID: "af19ec53-7df9-43c5-9df2-57f5235d94c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.427938 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w64w\" (UniqueName: \"kubernetes.io/projected/af19ec53-7df9-43c5-9df2-57f5235d94c9-kube-api-access-5w64w\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.427967 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.427978 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af19ec53-7df9-43c5-9df2-57f5235d94c9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.692100 4771 generic.go:334] "Generic (PLEG): container finished" podID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerID="2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47" exitCode=0 Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.692157 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlqcw" event={"ID":"af19ec53-7df9-43c5-9df2-57f5235d94c9","Type":"ContainerDied","Data":"2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47"} Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.692217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlqcw" event={"ID":"af19ec53-7df9-43c5-9df2-57f5235d94c9","Type":"ContainerDied","Data":"addeb311ba886d27b867db56c8e7a043ba7e5a484c7ca03707ca764b4da7f1a8"} Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.692235 4771 scope.go:117] "RemoveContainer" containerID="2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.692849 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hlqcw" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.730084 4771 scope.go:117] "RemoveContainer" containerID="ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.766137 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hlqcw"] Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.785824 4771 scope.go:117] "RemoveContainer" containerID="2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.791384 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hlqcw"] Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.818655 4771 scope.go:117] "RemoveContainer" containerID="2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47" Feb 27 01:30:25 crc kubenswrapper[4771]: E0227 01:30:25.819196 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47\": container with ID starting with 2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47 not found: ID does not exist" containerID="2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.819239 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47"} err="failed to get container status \"2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47\": rpc error: code = NotFound desc = could not find container \"2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47\": container with ID starting with 2085b255dc5e969c6625a93c489666e4710365ec0a7cb3964b7228980dc2fd47 not found: ID does not exist" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.819266 4771 scope.go:117] "RemoveContainer" containerID="ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e" Feb 27 01:30:25 crc kubenswrapper[4771]: E0227 01:30:25.819674 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e\": container with ID starting with ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e not found: ID does not exist" containerID="ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.819722 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e"} err="failed to get container status \"ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e\": rpc error: code = NotFound desc = could not find container \"ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e\": container with ID starting with ad4e29c30bca3240351641279354435ab83f31bd7fa203b2a4939af1fbbb580e not found: ID does not exist" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.819761 4771 scope.go:117] "RemoveContainer" containerID="2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4" Feb 27 01:30:25 crc kubenswrapper[4771]: E0227 01:30:25.820203 4771 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4\": container with ID starting with 2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4 not found: ID does not exist" containerID="2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4" Feb 27 01:30:25 crc kubenswrapper[4771]: I0227 01:30:25.820270 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4"} err="failed to get container status \"2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4\": rpc error: code = NotFound desc = could not find container \"2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4\": container with ID starting with 2854ae0089507f061ce0ee755ecf27ee3cd12ac5657ef937dbf67eface8d54f4 not found: ID does not exist" Feb 27 01:30:26 crc kubenswrapper[4771]: I0227 01:30:26.275679 4771 scope.go:117] "RemoveContainer" containerID="c181f99a92bb6947d480fa246210ecc8c94e2dc39fc910dafa79d202a559fae5" Feb 27 01:30:26 crc kubenswrapper[4771]: I0227 01:30:26.327732 4771 scope.go:117] "RemoveContainer" containerID="4e1b659b45a1a370910ba8313405db513e67bf9f0b1146bc6f3fa60e4de0a02d" Feb 27 01:30:26 crc kubenswrapper[4771]: I0227 01:30:26.375812 4771 scope.go:117] "RemoveContainer" containerID="abe9e9b98dd3724d60e36667b2ac53e4e7a0bb1891c7a5429756bec39892f1dc" Feb 27 01:30:26 crc kubenswrapper[4771]: I0227 01:30:26.429838 4771 scope.go:117] "RemoveContainer" containerID="00fa4b891b821b195565259e42c2ca98d82e3fa932565fb0a3a1e9128225c110" Feb 27 01:30:26 crc kubenswrapper[4771]: I0227 01:30:26.462225 4771 scope.go:117] "RemoveContainer" containerID="1e50491ccf3ca68206e4ef11def086bfb6d3b305b3a567410e82b4fc8382b8f3" Feb 27 01:30:26 crc kubenswrapper[4771]: I0227 01:30:26.516949 4771 scope.go:117] "RemoveContainer" containerID="756ffee0e7508ceca49f673f36c03777cf26b5f9a2f5ebf61d40abff77ef74b3" Feb 27 01:30:27 crc kubenswrapper[4771]: I0227 01:30:27.793200 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" path="/var/lib/kubelet/pods/af19ec53-7df9-43c5-9df2-57f5235d94c9/volumes" Feb 27 01:30:28 crc kubenswrapper[4771]: I0227 01:30:28.953439 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:30:28 crc kubenswrapper[4771]: I0227 01:30:28.953500 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.723625 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6rvzl"] Feb 27 01:30:47 crc kubenswrapper[4771]: E0227 01:30:47.724892 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerName="registry-server" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.724915 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" 
containerName="registry-server" Feb 27 01:30:47 crc kubenswrapper[4771]: E0227 01:30:47.724949 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerName="extract-utilities" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.724960 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerName="extract-utilities" Feb 27 01:30:47 crc kubenswrapper[4771]: E0227 01:30:47.725009 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerName="extract-content" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.725023 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerName="extract-content" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.725310 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="af19ec53-7df9-43c5-9df2-57f5235d94c9" containerName="registry-server" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.727625 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.739487 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rvzl"] Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.761859 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-utilities\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.761940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-catalog-content\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.762328 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7rvk\" (UniqueName: \"kubernetes.io/projected/f9303897-ddbf-43c8-8617-477965d039ce-kube-api-access-n7rvk\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.864034 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-utilities\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.864718 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-catalog-content\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.864938 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n7rvk\" (UniqueName: \"kubernetes.io/projected/f9303897-ddbf-43c8-8617-477965d039ce-kube-api-access-n7rvk\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.865209 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-catalog-content\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.864968 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-utilities\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:47 crc kubenswrapper[4771]: I0227 01:30:47.888247 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7rvk\" (UniqueName: \"kubernetes.io/projected/f9303897-ddbf-43c8-8617-477965d039ce-kube-api-access-n7rvk\") pod \"redhat-operators-6rvzl\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:48 crc kubenswrapper[4771]: I0227 01:30:48.069883 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:48 crc kubenswrapper[4771]: W0227 01:30:48.575800 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9303897_ddbf_43c8_8617_477965d039ce.slice/crio-c3c71448bf5ffe38d4f48a65fceea168e1929000399c4bda215b5ce0b30baa87 WatchSource:0}: Error finding container c3c71448bf5ffe38d4f48a65fceea168e1929000399c4bda215b5ce0b30baa87: Status 404 returned error can't find the container with id c3c71448bf5ffe38d4f48a65fceea168e1929000399c4bda215b5ce0b30baa87 Feb 27 01:30:48 crc kubenswrapper[4771]: I0227 01:30:48.578523 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rvzl"] Feb 27 01:30:48 crc kubenswrapper[4771]: I0227 01:30:48.959623 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9303897-ddbf-43c8-8617-477965d039ce" containerID="03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173" exitCode=0 Feb 27 01:30:48 crc kubenswrapper[4771]: I0227 01:30:48.959687 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rvzl" event={"ID":"f9303897-ddbf-43c8-8617-477965d039ce","Type":"ContainerDied","Data":"03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173"} Feb 27 01:30:48 crc kubenswrapper[4771]: I0227 01:30:48.960089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rvzl" event={"ID":"f9303897-ddbf-43c8-8617-477965d039ce","Type":"ContainerStarted","Data":"c3c71448bf5ffe38d4f48a65fceea168e1929000399c4bda215b5ce0b30baa87"} Feb 27 01:30:50 crc kubenswrapper[4771]: I0227 01:30:50.983271 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rvzl" event={"ID":"f9303897-ddbf-43c8-8617-477965d039ce","Type":"ContainerStarted","Data":"9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5"} Feb 27 01:30:53 crc 
kubenswrapper[4771]: I0227 01:30:53.008905 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9303897-ddbf-43c8-8617-477965d039ce" containerID="9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5" exitCode=0 Feb 27 01:30:53 crc kubenswrapper[4771]: I0227 01:30:53.008996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rvzl" event={"ID":"f9303897-ddbf-43c8-8617-477965d039ce","Type":"ContainerDied","Data":"9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5"} Feb 27 01:30:54 crc kubenswrapper[4771]: I0227 01:30:54.018984 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rvzl" event={"ID":"f9303897-ddbf-43c8-8617-477965d039ce","Type":"ContainerStarted","Data":"09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf"} Feb 27 01:30:54 crc kubenswrapper[4771]: I0227 01:30:54.046654 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6rvzl" podStartSLOduration=2.4687230319999998 podStartE2EDuration="7.046631143s" podCreationTimestamp="2026-02-27 01:30:47 +0000 UTC" firstStartedPulling="2026-02-27 01:30:48.963016962 +0000 UTC m=+1561.900578280" lastFinishedPulling="2026-02-27 01:30:53.540925073 +0000 UTC m=+1566.478486391" observedRunningTime="2026-02-27 01:30:54.042217144 +0000 UTC m=+1566.979778432" watchObservedRunningTime="2026-02-27 01:30:54.046631143 +0000 UTC m=+1566.984192441" Feb 27 01:30:58 crc kubenswrapper[4771]: I0227 01:30:58.070198 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:58 crc kubenswrapper[4771]: I0227 01:30:58.070752 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:30:58 crc kubenswrapper[4771]: I0227 01:30:58.953045 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:30:58 crc kubenswrapper[4771]: I0227 01:30:58.953102 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:30:58 crc kubenswrapper[4771]: I0227 01:30:58.953149 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:30:58 crc kubenswrapper[4771]: I0227 01:30:58.953834 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2fa94f2e2bead8dd6b922ec063e6ef0f0039cd25cc010b30deb4ce3bb130b4c"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:30:58 crc kubenswrapper[4771]: I0227 01:30:58.953889 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" 
containerName="machine-config-daemon" containerID="cri-o://c2fa94f2e2bead8dd6b922ec063e6ef0f0039cd25cc010b30deb4ce3bb130b4c" gracePeriod=600 Feb 27 01:30:59 crc kubenswrapper[4771]: I0227 01:30:59.124154 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6rvzl" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="registry-server" probeResult="failure" output=< Feb 27 01:30:59 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 27 01:30:59 crc kubenswrapper[4771]: > Feb 27 01:31:00 crc kubenswrapper[4771]: I0227 01:31:00.086316 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="c2fa94f2e2bead8dd6b922ec063e6ef0f0039cd25cc010b30deb4ce3bb130b4c" exitCode=0 Feb 27 01:31:00 crc kubenswrapper[4771]: I0227 01:31:00.086367 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"c2fa94f2e2bead8dd6b922ec063e6ef0f0039cd25cc010b30deb4ce3bb130b4c"} Feb 27 01:31:00 crc kubenswrapper[4771]: I0227 01:31:00.086802 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"} Feb 27 01:31:00 crc kubenswrapper[4771]: I0227 01:31:00.086824 4771 scope.go:117] "RemoveContainer" containerID="466a33b6112ab220887139a7abe10596ba6afedbccef8b636c28177f74cb6a85" Feb 27 01:31:08 crc kubenswrapper[4771]: I0227 01:31:08.139104 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:31:08 crc kubenswrapper[4771]: I0227 01:31:08.192039 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:31:08 crc kubenswrapper[4771]: I0227 01:31:08.389500 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rvzl"] Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.188298 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6rvzl" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="registry-server" containerID="cri-o://09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf" gracePeriod=2 Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.703053 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.812205 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-catalog-content\") pod \"f9303897-ddbf-43c8-8617-477965d039ce\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.812282 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7rvk\" (UniqueName: \"kubernetes.io/projected/f9303897-ddbf-43c8-8617-477965d039ce-kube-api-access-n7rvk\") pod \"f9303897-ddbf-43c8-8617-477965d039ce\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.812454 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-utilities\") pod \"f9303897-ddbf-43c8-8617-477965d039ce\" (UID: \"f9303897-ddbf-43c8-8617-477965d039ce\") " Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.813379 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-utilities" (OuterVolumeSpecName: "utilities") pod "f9303897-ddbf-43c8-8617-477965d039ce" (UID: "f9303897-ddbf-43c8-8617-477965d039ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.821451 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9303897-ddbf-43c8-8617-477965d039ce-kube-api-access-n7rvk" (OuterVolumeSpecName: "kube-api-access-n7rvk") pod "f9303897-ddbf-43c8-8617-477965d039ce" (UID: "f9303897-ddbf-43c8-8617-477965d039ce"). InnerVolumeSpecName "kube-api-access-n7rvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.914789 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7rvk\" (UniqueName: \"kubernetes.io/projected/f9303897-ddbf-43c8-8617-477965d039ce-kube-api-access-n7rvk\") on node \"crc\" DevicePath \"\"" Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.914814 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:31:09 crc kubenswrapper[4771]: I0227 01:31:09.954856 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9303897-ddbf-43c8-8617-477965d039ce" (UID: "f9303897-ddbf-43c8-8617-477965d039ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.016719 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9303897-ddbf-43c8-8617-477965d039ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.198887 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9303897-ddbf-43c8-8617-477965d039ce" containerID="09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf" exitCode=0 Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.198935 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rvzl" event={"ID":"f9303897-ddbf-43c8-8617-477965d039ce","Type":"ContainerDied","Data":"09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf"} Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.198955 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rvzl" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.198974 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rvzl" event={"ID":"f9303897-ddbf-43c8-8617-477965d039ce","Type":"ContainerDied","Data":"c3c71448bf5ffe38d4f48a65fceea168e1929000399c4bda215b5ce0b30baa87"} Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.198992 4771 scope.go:117] "RemoveContainer" containerID="09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.223521 4771 scope.go:117] "RemoveContainer" containerID="9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.232944 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rvzl"] Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.241534 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6rvzl"] Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.263993 4771 scope.go:117] "RemoveContainer" containerID="03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.288639 4771 scope.go:117] "RemoveContainer" containerID="09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf" Feb 27 01:31:10 crc kubenswrapper[4771]: E0227 01:31:10.289096 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf\": container with ID starting with 09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf not found: ID does not exist" containerID="09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.289142 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf"} err="failed to get container status \"09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf\": rpc error: code = NotFound desc = could not find container \"09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf\": container with ID starting with 09e0f461b445d59d60a78baf7809b74a40d8779adac60cdf5bfee978e8e77fdf not found: ID does not exist" Feb 27 01:31:10 crc 
kubenswrapper[4771]: I0227 01:31:10.289169 4771 scope.go:117] "RemoveContainer" containerID="9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5" Feb 27 01:31:10 crc kubenswrapper[4771]: E0227 01:31:10.289581 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5\": container with ID starting with 9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5 not found: ID does not exist" containerID="9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.289613 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5"} err="failed to get container status \"9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5\": rpc error: code = NotFound desc = could not find container \"9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5\": container with ID starting with 9e82793d24c7da5331186c7345d1768b20cc5d223dc445c6d054d914f0afb6d5 not found: ID does not exist" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.289635 4771 scope.go:117] "RemoveContainer" containerID="03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173" Feb 27 01:31:10 crc kubenswrapper[4771]: E0227 01:31:10.289892 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173\": container with ID starting with 03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173 not found: ID does not exist" containerID="03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173" Feb 27 01:31:10 crc kubenswrapper[4771]: I0227 01:31:10.289911 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173"} err="failed to get container status \"03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173\": rpc error: code = NotFound desc = could not find container \"03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173\": container with ID starting with 03d01f662ff63be47205d93061ec1525c1a5aa41bc73cd8ec56b38f38022f173 not found: ID does not exist" Feb 27 01:31:11 crc kubenswrapper[4771]: I0227 01:31:11.785310 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9303897-ddbf-43c8-8617-477965d039ce" path="/var/lib/kubelet/pods/f9303897-ddbf-43c8-8617-477965d039ce/volumes" Feb 27 01:31:59 crc kubenswrapper[4771]: I0227 01:31:59.728881 4771 generic.go:334] "Generic (PLEG): container finished" podID="34ae2923-be95-45e5-a840-dfea9b17f9c4" containerID="c60249e9cd18d51b8fb2701778227ad7426d56e3fe9e025b27a397eb346d6a6a" exitCode=0 Feb 27 01:31:59 crc kubenswrapper[4771]: I0227 01:31:59.729043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" event={"ID":"34ae2923-be95-45e5-a840-dfea9b17f9c4","Type":"ContainerDied","Data":"c60249e9cd18d51b8fb2701778227ad7426d56e3fe9e025b27a397eb346d6a6a"} Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.142294 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535932-4rx7t"] Feb 27 01:32:00 crc kubenswrapper[4771]: E0227 01:32:00.142646 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="extract-content" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.142658 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="extract-content" Feb 27 01:32:00 crc kubenswrapper[4771]: E0227 01:32:00.142679 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="registry-server" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.142685 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="registry-server" Feb 27 01:32:00 crc kubenswrapper[4771]: E0227 01:32:00.142694 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="extract-utilities" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.142700 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="extract-utilities" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.142886 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9303897-ddbf-43c8-8617-477965d039ce" containerName="registry-server" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.143413 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535932-4rx7t" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.145303 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.145655 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.147064 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.157143 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535932-4rx7t"] Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.210740 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhcq5\" (UniqueName: \"kubernetes.io/projected/ef8584c4-bf5c-47f7-83af-af6162407eba-kube-api-access-fhcq5\") pod \"auto-csr-approver-29535932-4rx7t\" (UID: \"ef8584c4-bf5c-47f7-83af-af6162407eba\") " pod="openshift-infra/auto-csr-approver-29535932-4rx7t" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.313033 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhcq5\" (UniqueName: \"kubernetes.io/projected/ef8584c4-bf5c-47f7-83af-af6162407eba-kube-api-access-fhcq5\") pod \"auto-csr-approver-29535932-4rx7t\" (UID: \"ef8584c4-bf5c-47f7-83af-af6162407eba\") " pod="openshift-infra/auto-csr-approver-29535932-4rx7t" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.337788 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhcq5\" (UniqueName: \"kubernetes.io/projected/ef8584c4-bf5c-47f7-83af-af6162407eba-kube-api-access-fhcq5\") pod \"auto-csr-approver-29535932-4rx7t\" (UID: \"ef8584c4-bf5c-47f7-83af-af6162407eba\") " pod="openshift-infra/auto-csr-approver-29535932-4rx7t" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.461295 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535932-4rx7t" Feb 27 01:32:00 crc kubenswrapper[4771]: I0227 01:32:00.968046 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535932-4rx7t"] Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.148097 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.226719 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-bootstrap-combined-ca-bundle\") pod \"34ae2923-be95-45e5-a840-dfea9b17f9c4\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.226900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vf68\" (UniqueName: \"kubernetes.io/projected/34ae2923-be95-45e5-a840-dfea9b17f9c4-kube-api-access-5vf68\") pod \"34ae2923-be95-45e5-a840-dfea9b17f9c4\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.226935 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-inventory\") pod \"34ae2923-be95-45e5-a840-dfea9b17f9c4\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.227052 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-ssh-key-openstack-edpm-ipam\") pod \"34ae2923-be95-45e5-a840-dfea9b17f9c4\" (UID: \"34ae2923-be95-45e5-a840-dfea9b17f9c4\") " Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.238246 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "34ae2923-be95-45e5-a840-dfea9b17f9c4" (UID: "34ae2923-be95-45e5-a840-dfea9b17f9c4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.238396 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ae2923-be95-45e5-a840-dfea9b17f9c4-kube-api-access-5vf68" (OuterVolumeSpecName: "kube-api-access-5vf68") pod "34ae2923-be95-45e5-a840-dfea9b17f9c4" (UID: "34ae2923-be95-45e5-a840-dfea9b17f9c4"). InnerVolumeSpecName "kube-api-access-5vf68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.260636 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-inventory" (OuterVolumeSpecName: "inventory") pod "34ae2923-be95-45e5-a840-dfea9b17f9c4" (UID: "34ae2923-be95-45e5-a840-dfea9b17f9c4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.261116 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "34ae2923-be95-45e5-a840-dfea9b17f9c4" (UID: "34ae2923-be95-45e5-a840-dfea9b17f9c4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.329441 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.329483 4771 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.329497 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vf68\" (UniqueName: \"kubernetes.io/projected/34ae2923-be95-45e5-a840-dfea9b17f9c4-kube-api-access-5vf68\") on node \"crc\" DevicePath \"\"" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.329509 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34ae2923-be95-45e5-a840-dfea9b17f9c4-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.777900 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.791619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5" event={"ID":"34ae2923-be95-45e5-a840-dfea9b17f9c4","Type":"ContainerDied","Data":"7bd272f7844dc05fc67a08d66f97b2fe7a37aa72d23ee8b0314c686df94b47e1"} Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.791664 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bd272f7844dc05fc67a08d66f97b2fe7a37aa72d23ee8b0314c686df94b47e1" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.793451 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535932-4rx7t" event={"ID":"ef8584c4-bf5c-47f7-83af-af6162407eba","Type":"ContainerStarted","Data":"167d7228df8dc300aae05d9e02738407c861c28716598cc1488e59d2547c22bd"} Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.858316 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f"] Feb 27 01:32:01 crc kubenswrapper[4771]: E0227 01:32:01.858825 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ae2923-be95-45e5-a840-dfea9b17f9c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.858851 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ae2923-be95-45e5-a840-dfea9b17f9c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.859093 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="34ae2923-be95-45e5-a840-dfea9b17f9c4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.859805 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.864598 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f"] Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.865756 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.865992 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.866117 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.866217 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.938214 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-496nb\" (UniqueName: \"kubernetes.io/projected/761add5e-bade-44af-be1b-3cbcaa54f19a-kube-api-access-496nb\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.938280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:01 crc kubenswrapper[4771]: I0227 01:32:01.938529 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.039869 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-496nb\" (UniqueName: \"kubernetes.io/projected/761add5e-bade-44af-be1b-3cbcaa54f19a-kube-api-access-496nb\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.039940 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.040009 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.052498 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.052896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.057763 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-496nb\" (UniqueName: \"kubernetes.io/projected/761add5e-bade-44af-be1b-3cbcaa54f19a-kube-api-access-496nb\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-47l7f\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.218235 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.731141 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f"] Feb 27 01:32:02 crc kubenswrapper[4771]: W0227 01:32:02.739312 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod761add5e_bade_44af_be1b_3cbcaa54f19a.slice/crio-e23d6178f4f616049f1098ffaaabfc90c0a0789a4b2f65b032712212abc1ad8b WatchSource:0}: Error finding container e23d6178f4f616049f1098ffaaabfc90c0a0789a4b2f65b032712212abc1ad8b: Status 404 returned error can't find the container with id e23d6178f4f616049f1098ffaaabfc90c0a0789a4b2f65b032712212abc1ad8b Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.803303 4771 generic.go:334] "Generic (PLEG): container finished" podID="ef8584c4-bf5c-47f7-83af-af6162407eba" containerID="113f2d9db81d779181c411a5cffdb485c1a9dba56549f29c916f6125a9d62fc5" exitCode=0 Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.803354 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535932-4rx7t" event={"ID":"ef8584c4-bf5c-47f7-83af-af6162407eba","Type":"ContainerDied","Data":"113f2d9db81d779181c411a5cffdb485c1a9dba56549f29c916f6125a9d62fc5"} Feb 27 01:32:02 crc kubenswrapper[4771]: I0227 01:32:02.804999 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" event={"ID":"761add5e-bade-44af-be1b-3cbcaa54f19a","Type":"ContainerStarted","Data":"e23d6178f4f616049f1098ffaaabfc90c0a0789a4b2f65b032712212abc1ad8b"} Feb 27 01:32:03 crc kubenswrapper[4771]: I0227 01:32:03.821222 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" event={"ID":"761add5e-bade-44af-be1b-3cbcaa54f19a","Type":"ContainerStarted","Data":"1de580526d764652eb1b9f2e01870ac87fe9f3f068a83a786c13261c273d4314"} Feb 27 01:32:03 crc kubenswrapper[4771]: I0227 01:32:03.855930 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" podStartSLOduration=2.202375075 podStartE2EDuration="2.855908299s" podCreationTimestamp="2026-02-27 01:32:01 +0000 UTC" firstStartedPulling="2026-02-27 01:32:02.741478888 +0000 UTC m=+1635.679040176" lastFinishedPulling="2026-02-27 01:32:03.395012102 +0000 UTC m=+1636.332573400" observedRunningTime="2026-02-27 01:32:03.842339311 +0000 UTC m=+1636.779900609" watchObservedRunningTime="2026-02-27 01:32:03.855908299 +0000 UTC m=+1636.793469597" Feb 27 01:32:04 crc kubenswrapper[4771]: I0227 01:32:04.193967 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535932-4rx7t" Feb 27 01:32:04 crc kubenswrapper[4771]: I0227 01:32:04.288533 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhcq5\" (UniqueName: \"kubernetes.io/projected/ef8584c4-bf5c-47f7-83af-af6162407eba-kube-api-access-fhcq5\") pod \"ef8584c4-bf5c-47f7-83af-af6162407eba\" (UID: \"ef8584c4-bf5c-47f7-83af-af6162407eba\") " Feb 27 01:32:04 crc kubenswrapper[4771]: I0227 01:32:04.301694 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8584c4-bf5c-47f7-83af-af6162407eba-kube-api-access-fhcq5" (OuterVolumeSpecName: "kube-api-access-fhcq5") pod "ef8584c4-bf5c-47f7-83af-af6162407eba" (UID: "ef8584c4-bf5c-47f7-83af-af6162407eba"). InnerVolumeSpecName "kube-api-access-fhcq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:32:04 crc kubenswrapper[4771]: I0227 01:32:04.399057 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhcq5\" (UniqueName: \"kubernetes.io/projected/ef8584c4-bf5c-47f7-83af-af6162407eba-kube-api-access-fhcq5\") on node \"crc\" DevicePath \"\"" Feb 27 01:32:04 crc kubenswrapper[4771]: I0227 01:32:04.837481 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535932-4rx7t" event={"ID":"ef8584c4-bf5c-47f7-83af-af6162407eba","Type":"ContainerDied","Data":"167d7228df8dc300aae05d9e02738407c861c28716598cc1488e59d2547c22bd"} Feb 27 01:32:04 crc kubenswrapper[4771]: I0227 01:32:04.837566 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="167d7228df8dc300aae05d9e02738407c861c28716598cc1488e59d2547c22bd" Feb 27 01:32:04 crc kubenswrapper[4771]: I0227 01:32:04.837516 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535932-4rx7t" Feb 27 01:32:05 crc kubenswrapper[4771]: I0227 01:32:05.265925 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535926-bsrhj"] Feb 27 01:32:05 crc kubenswrapper[4771]: I0227 01:32:05.275604 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535926-bsrhj"] Feb 27 01:32:05 crc kubenswrapper[4771]: I0227 01:32:05.785842 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf6ea82-e9fc-4fd6-81c5-6883afb00b83" path="/var/lib/kubelet/pods/abf6ea82-e9fc-4fd6-81c5-6883afb00b83/volumes" Feb 27 01:32:26 crc kubenswrapper[4771]: I0227 01:32:26.685573 4771 scope.go:117] "RemoveContainer" containerID="596827d833e8740f4219de319278bedbde3129517faaddc09167ab70da358220" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.141899 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6vsgv"] Feb 27 01:32:42 crc kubenswrapper[4771]: E0227 01:32:42.142950 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8584c4-bf5c-47f7-83af-af6162407eba" containerName="oc" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.142969 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8584c4-bf5c-47f7-83af-af6162407eba" containerName="oc" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.143225 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8584c4-bf5c-47f7-83af-af6162407eba" containerName="oc" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.144910 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.179330 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vsgv"] Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.240517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-utilities\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.240688 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-catalog-content\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.240786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvd4f\" (UniqueName: \"kubernetes.io/projected/22d852f1-4370-4163-bd48-dc909dcd2054-kube-api-access-lvd4f\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.342529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-catalog-content\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.342641 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvd4f\" (UniqueName: \"kubernetes.io/projected/22d852f1-4370-4163-bd48-dc909dcd2054-kube-api-access-lvd4f\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.342803 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-utilities\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.343060 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-catalog-content\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.343180 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-utilities\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.363449 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lvd4f\" (UniqueName: \"kubernetes.io/projected/22d852f1-4370-4163-bd48-dc909dcd2054-kube-api-access-lvd4f\") pod \"redhat-marketplace-6vsgv\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.475954 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:42 crc kubenswrapper[4771]: I0227 01:32:42.944909 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vsgv"] Feb 27 01:32:42 crc kubenswrapper[4771]: W0227 01:32:42.945729 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22d852f1_4370_4163_bd48_dc909dcd2054.slice/crio-56defa5b52e567021e2cbc980da63933117d71171c1b1274aa6ce622f095e2c4 WatchSource:0}: Error finding container 56defa5b52e567021e2cbc980da63933117d71171c1b1274aa6ce622f095e2c4: Status 404 returned error can't find the container with id 56defa5b52e567021e2cbc980da63933117d71171c1b1274aa6ce622f095e2c4 Feb 27 01:32:43 crc kubenswrapper[4771]: I0227 01:32:43.289859 4771 generic.go:334] "Generic (PLEG): container finished" podID="22d852f1-4370-4163-bd48-dc909dcd2054" containerID="cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89" exitCode=0 Feb 27 01:32:43 crc kubenswrapper[4771]: I0227 01:32:43.289967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vsgv" event={"ID":"22d852f1-4370-4163-bd48-dc909dcd2054","Type":"ContainerDied","Data":"cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89"} Feb 27 01:32:43 crc kubenswrapper[4771]: I0227 01:32:43.290225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vsgv" event={"ID":"22d852f1-4370-4163-bd48-dc909dcd2054","Type":"ContainerStarted","Data":"56defa5b52e567021e2cbc980da63933117d71171c1b1274aa6ce622f095e2c4"} Feb 27 01:32:44 crc kubenswrapper[4771]: I0227 01:32:44.302680 4771 generic.go:334] "Generic (PLEG): container finished" podID="22d852f1-4370-4163-bd48-dc909dcd2054" containerID="f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c" exitCode=0 Feb 27 01:32:44 crc kubenswrapper[4771]: I0227 01:32:44.302738 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vsgv" event={"ID":"22d852f1-4370-4163-bd48-dc909dcd2054","Type":"ContainerDied","Data":"f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c"} Feb 27 01:32:45 crc kubenswrapper[4771]: I0227 01:32:45.319078 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vsgv" event={"ID":"22d852f1-4370-4163-bd48-dc909dcd2054","Type":"ContainerStarted","Data":"dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4"} Feb 27 01:32:45 crc kubenswrapper[4771]: I0227 01:32:45.342152 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6vsgv" podStartSLOduration=1.9048974300000001 podStartE2EDuration="3.342125486s" podCreationTimestamp="2026-02-27 01:32:42 +0000 UTC" firstStartedPulling="2026-02-27 01:32:43.291489273 +0000 UTC m=+1676.229050561" lastFinishedPulling="2026-02-27 01:32:44.728717339 +0000 UTC m=+1677.666278617" observedRunningTime="2026-02-27 01:32:45.340561974 +0000 UTC m=+1678.278123282" 
watchObservedRunningTime="2026-02-27 01:32:45.342125486 +0000 UTC m=+1678.279686784" Feb 27 01:32:52 crc kubenswrapper[4771]: I0227 01:32:52.477161 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:52 crc kubenswrapper[4771]: I0227 01:32:52.477755 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:52 crc kubenswrapper[4771]: I0227 01:32:52.524530 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:53 crc kubenswrapper[4771]: I0227 01:32:53.505529 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:53 crc kubenswrapper[4771]: I0227 01:32:53.566970 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vsgv"] Feb 27 01:32:55 crc kubenswrapper[4771]: I0227 01:32:55.441013 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6vsgv" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" containerName="registry-server" containerID="cri-o://dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4" gracePeriod=2 Feb 27 01:32:55 crc kubenswrapper[4771]: I0227 01:32:55.889127 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.082473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvd4f\" (UniqueName: \"kubernetes.io/projected/22d852f1-4370-4163-bd48-dc909dcd2054-kube-api-access-lvd4f\") pod \"22d852f1-4370-4163-bd48-dc909dcd2054\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.082605 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-catalog-content\") pod \"22d852f1-4370-4163-bd48-dc909dcd2054\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.082793 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-utilities\") pod \"22d852f1-4370-4163-bd48-dc909dcd2054\" (UID: \"22d852f1-4370-4163-bd48-dc909dcd2054\") " Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.083691 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-utilities" (OuterVolumeSpecName: "utilities") pod "22d852f1-4370-4163-bd48-dc909dcd2054" (UID: "22d852f1-4370-4163-bd48-dc909dcd2054"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.093847 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d852f1-4370-4163-bd48-dc909dcd2054-kube-api-access-lvd4f" (OuterVolumeSpecName: "kube-api-access-lvd4f") pod "22d852f1-4370-4163-bd48-dc909dcd2054" (UID: "22d852f1-4370-4163-bd48-dc909dcd2054"). InnerVolumeSpecName "kube-api-access-lvd4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.104634 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22d852f1-4370-4163-bd48-dc909dcd2054" (UID: "22d852f1-4370-4163-bd48-dc909dcd2054"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.184387 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.184433 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvd4f\" (UniqueName: \"kubernetes.io/projected/22d852f1-4370-4163-bd48-dc909dcd2054-kube-api-access-lvd4f\") on node \"crc\" DevicePath \"\"" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.184448 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22d852f1-4370-4163-bd48-dc909dcd2054-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.455122 4771 generic.go:334] "Generic (PLEG): container finished" podID="22d852f1-4370-4163-bd48-dc909dcd2054" containerID="dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4" exitCode=0 Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.455183 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vsgv" event={"ID":"22d852f1-4370-4163-bd48-dc909dcd2054","Type":"ContainerDied","Data":"dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4"} Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.455209 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vsgv" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.455226 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vsgv" event={"ID":"22d852f1-4370-4163-bd48-dc909dcd2054","Type":"ContainerDied","Data":"56defa5b52e567021e2cbc980da63933117d71171c1b1274aa6ce622f095e2c4"} Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.455248 4771 scope.go:117] "RemoveContainer" containerID="dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.483633 4771 scope.go:117] "RemoveContainer" containerID="f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.507173 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vsgv"] Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.520199 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vsgv"] Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.526930 4771 scope.go:117] "RemoveContainer" containerID="cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.583498 4771 scope.go:117] "RemoveContainer" containerID="dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4" Feb 27 01:32:56 crc kubenswrapper[4771]: E0227 01:32:56.584322 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4\": container with ID starting with dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4 not found: ID does not exist" containerID="dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.584361 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4"} err="failed to get container status \"dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4\": rpc error: code = NotFound desc = could not find container \"dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4\": container with ID starting with dd277de85c8a89e9b13726c62e152672ce38b86b45b53f389296e7adc4fc3dc4 not found: ID does not exist" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.584386 4771 scope.go:117] "RemoveContainer" containerID="f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c" Feb 27 01:32:56 crc kubenswrapper[4771]: E0227 01:32:56.584855 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c\": container with ID starting with f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c not found: ID does not exist" containerID="f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.584909 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c"} err="failed to get container status \"f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c\": rpc error: code = NotFound desc = could not find 
container \"f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c\": container with ID starting with f4ffcc6b158de7519ca63bb428d7f50b48f14b227df769ab984493294d2cb28c not found: ID does not exist" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.584947 4771 scope.go:117] "RemoveContainer" containerID="cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89" Feb 27 01:32:56 crc kubenswrapper[4771]: E0227 01:32:56.585441 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89\": container with ID starting with cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89 not found: ID does not exist" containerID="cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89" Feb 27 01:32:56 crc kubenswrapper[4771]: I0227 01:32:56.585476 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89"} err="failed to get container status \"cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89\": rpc error: code = NotFound desc = could not find container \"cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89\": container with ID starting with cd09f1d1d8babf62509e03a656db8557c0cf0778ecceac8b073e80da7173eb89 not found: ID does not exist" Feb 27 01:32:57 crc kubenswrapper[4771]: I0227 01:32:57.786748 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" path="/var/lib/kubelet/pods/22d852f1-4370-4163-bd48-dc909dcd2054/volumes" Feb 27 01:33:08 crc kubenswrapper[4771]: I0227 01:33:08.067824 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8b3d-account-create-update-phxws"] Feb 27 01:33:08 crc kubenswrapper[4771]: I0227 01:33:08.080518 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ggggn"] Feb 27 01:33:08 crc kubenswrapper[4771]: I0227 01:33:08.098760 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8b3d-account-create-update-phxws"] Feb 27 01:33:08 crc kubenswrapper[4771]: I0227 01:33:08.098830 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ggggn"] Feb 27 01:33:09 crc kubenswrapper[4771]: I0227 01:33:09.793441 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31ac243-4c5d-4cf2-9be0-a2adbcf42186" path="/var/lib/kubelet/pods/b31ac243-4c5d-4cf2-9be0-a2adbcf42186/volumes" Feb 27 01:33:09 crc kubenswrapper[4771]: I0227 01:33:09.795119 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc234ed-1171-4186-ad79-6029cc652fda" path="/var/lib/kubelet/pods/ccc234ed-1171-4186-ad79-6029cc652fda/volumes" Feb 27 01:33:11 crc kubenswrapper[4771]: I0227 01:33:11.034393 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d365-account-create-update-hxfxb"] Feb 27 01:33:11 crc kubenswrapper[4771]: I0227 01:33:11.048452 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d365-account-create-update-hxfxb"] Feb 27 01:33:11 crc kubenswrapper[4771]: I0227 01:33:11.793455 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8f679e1-32b8-4041-bee8-4686a9a9ae2e" path="/var/lib/kubelet/pods/b8f679e1-32b8-4041-bee8-4686a9a9ae2e/volumes" Feb 27 01:33:12 crc kubenswrapper[4771]: I0227 01:33:12.036924 4771 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-c5xcw"] Feb 27 01:33:12 crc kubenswrapper[4771]: I0227 01:33:12.064986 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-c5xcw"] Feb 27 01:33:13 crc kubenswrapper[4771]: I0227 01:33:13.790169 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae89ba17-392c-48f6-b05f-5217350743fe" path="/var/lib/kubelet/pods/ae89ba17-392c-48f6-b05f-5217350743fe/volumes" Feb 27 01:33:17 crc kubenswrapper[4771]: I0227 01:33:17.039678 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rnr4x"] Feb 27 01:33:17 crc kubenswrapper[4771]: I0227 01:33:17.055378 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-768f-account-create-update-gxklj"] Feb 27 01:33:17 crc kubenswrapper[4771]: I0227 01:33:17.070871 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rnr4x"] Feb 27 01:33:17 crc kubenswrapper[4771]: I0227 01:33:17.085173 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-768f-account-create-update-gxklj"] Feb 27 01:33:17 crc kubenswrapper[4771]: I0227 01:33:17.795802 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60076d22-0bfa-4f4e-adde-42e991825877" path="/var/lib/kubelet/pods/60076d22-0bfa-4f4e-adde-42e991825877/volumes" Feb 27 01:33:17 crc kubenswrapper[4771]: I0227 01:33:17.799708 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f" path="/var/lib/kubelet/pods/ecdc6e63-c0a3-4fec-9fb5-19b41e507b4f/volumes" Feb 27 01:33:26 crc kubenswrapper[4771]: I0227 01:33:26.774020 4771 scope.go:117] "RemoveContainer" containerID="6d058c14f94f6075b3d3fc347b91c26d963448674cbe80dcd44a15eb3a8e7e26" Feb 27 01:33:26 crc kubenswrapper[4771]: I0227 01:33:26.801784 4771 scope.go:117] "RemoveContainer" containerID="67b43932ca2388709a10439977aacc1ede8a467713f717bc06b171fd409aa17d" Feb 27 01:33:26 crc kubenswrapper[4771]: I0227 01:33:26.825223 4771 scope.go:117] "RemoveContainer" containerID="f4b40801898389d09b69135513f45175da3b95797febad5e12f39119006aed40" Feb 27 01:33:26 crc kubenswrapper[4771]: I0227 01:33:26.850721 4771 scope.go:117] "RemoveContainer" containerID="71760be4e11a514bdbdf989a6e3e8a8920da84ab5247daa34dcb145e0089e4d1" Feb 27 01:33:26 crc kubenswrapper[4771]: I0227 01:33:26.926452 4771 scope.go:117] "RemoveContainer" containerID="41d35fa3bc72ae3e6a241953dad69a0fe304a3eb8fee468bc3293f76983c9c95" Feb 27 01:33:26 crc kubenswrapper[4771]: I0227 01:33:26.973204 4771 scope.go:117] "RemoveContainer" containerID="b24e6c7bf29978fa688d9704ce181fb0d1d5b851432febc23b8945cbf26408f0" Feb 27 01:33:27 crc kubenswrapper[4771]: I0227 01:33:27.004707 4771 scope.go:117] "RemoveContainer" containerID="cfc3024b9a7b28be1f5bcdd93565d72778f90a3bcd576c9e85767b245cd395cb" Feb 27 01:33:27 crc kubenswrapper[4771]: I0227 01:33:27.045861 4771 scope.go:117] "RemoveContainer" containerID="727b38ebe58c1975f28912b967f22314cfe0bfe0e1164b4f82597f2eb40c1934" Feb 27 01:33:27 crc kubenswrapper[4771]: I0227 01:33:27.071762 4771 scope.go:117] "RemoveContainer" containerID="2e182beeca8a6f3ee67571563ae6acb6f8d973520fd0524931f48e9b19489ad9" Feb 27 01:33:28 crc kubenswrapper[4771]: I0227 01:33:28.953018 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:33:28 crc kubenswrapper[4771]: I0227 01:33:28.953624 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:33:30 crc kubenswrapper[4771]: I0227 01:33:30.857923 4771 generic.go:334] "Generic (PLEG): container finished" podID="761add5e-bade-44af-be1b-3cbcaa54f19a" containerID="1de580526d764652eb1b9f2e01870ac87fe9f3f068a83a786c13261c273d4314" exitCode=0 Feb 27 01:33:30 crc kubenswrapper[4771]: I0227 01:33:30.858025 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" event={"ID":"761add5e-bade-44af-be1b-3cbcaa54f19a","Type":"ContainerDied","Data":"1de580526d764652eb1b9f2e01870ac87fe9f3f068a83a786c13261c273d4314"} Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.289377 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.390375 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-ssh-key-openstack-edpm-ipam\") pod \"761add5e-bade-44af-be1b-3cbcaa54f19a\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.390455 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-496nb\" (UniqueName: \"kubernetes.io/projected/761add5e-bade-44af-be1b-3cbcaa54f19a-kube-api-access-496nb\") pod \"761add5e-bade-44af-be1b-3cbcaa54f19a\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.390608 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-inventory\") pod \"761add5e-bade-44af-be1b-3cbcaa54f19a\" (UID: \"761add5e-bade-44af-be1b-3cbcaa54f19a\") " Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.399585 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761add5e-bade-44af-be1b-3cbcaa54f19a-kube-api-access-496nb" (OuterVolumeSpecName: "kube-api-access-496nb") pod "761add5e-bade-44af-be1b-3cbcaa54f19a" (UID: "761add5e-bade-44af-be1b-3cbcaa54f19a"). InnerVolumeSpecName "kube-api-access-496nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.423958 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-inventory" (OuterVolumeSpecName: "inventory") pod "761add5e-bade-44af-be1b-3cbcaa54f19a" (UID: "761add5e-bade-44af-be1b-3cbcaa54f19a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.425219 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "761add5e-bade-44af-be1b-3cbcaa54f19a" (UID: "761add5e-bade-44af-be1b-3cbcaa54f19a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.493267 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.493316 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-496nb\" (UniqueName: \"kubernetes.io/projected/761add5e-bade-44af-be1b-3cbcaa54f19a-kube-api-access-496nb\") on node \"crc\" DevicePath \"\"" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.493331 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/761add5e-bade-44af-be1b-3cbcaa54f19a-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.879377 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" event={"ID":"761add5e-bade-44af-be1b-3cbcaa54f19a","Type":"ContainerDied","Data":"e23d6178f4f616049f1098ffaaabfc90c0a0789a4b2f65b032712212abc1ad8b"} Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.879763 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e23d6178f4f616049f1098ffaaabfc90c0a0789a4b2f65b032712212abc1ad8b" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.879444 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-47l7f" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.985430 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf"] Feb 27 01:33:32 crc kubenswrapper[4771]: E0227 01:33:32.985854 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761add5e-bade-44af-be1b-3cbcaa54f19a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.985874 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="761add5e-bade-44af-be1b-3cbcaa54f19a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 01:33:32 crc kubenswrapper[4771]: E0227 01:33:32.985890 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" containerName="extract-content" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.985898 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" containerName="extract-content" Feb 27 01:33:32 crc kubenswrapper[4771]: E0227 01:33:32.985913 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" containerName="extract-utilities" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.985922 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" containerName="extract-utilities" Feb 27 01:33:32 crc kubenswrapper[4771]: E0227 01:33:32.985958 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" containerName="registry-server" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.985965 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" containerName="registry-server" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.986125 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="761add5e-bade-44af-be1b-3cbcaa54f19a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.986152 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d852f1-4370-4163-bd48-dc909dcd2054" containerName="registry-server" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.987212 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.989373 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.989416 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.989644 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:33:32 crc kubenswrapper[4771]: I0227 01:33:32.999681 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.012345 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf"] Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.104574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.104656 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.104975 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p86kn\" (UniqueName: \"kubernetes.io/projected/acd636bf-528e-4bbe-8220-e4a9b755b025-kube-api-access-p86kn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.206288 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.206398 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p86kn\" (UniqueName: \"kubernetes.io/projected/acd636bf-528e-4bbe-8220-e4a9b755b025-kube-api-access-p86kn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.206482 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.212288 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.212290 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.223844 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p86kn\" (UniqueName: \"kubernetes.io/projected/acd636bf-528e-4bbe-8220-e4a9b755b025-kube-api-access-p86kn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.365945 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" Feb 27 01:33:33 crc kubenswrapper[4771]: I0227 01:33:33.909813 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf"] Feb 27 01:33:34 crc kubenswrapper[4771]: I0227 01:33:34.900499 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" event={"ID":"acd636bf-528e-4bbe-8220-e4a9b755b025","Type":"ContainerStarted","Data":"2a7b691523c1868418ac367b6f8bdd30409429d829e2243a0d42c8ae870fc256"} Feb 27 01:33:34 crc kubenswrapper[4771]: I0227 01:33:34.900868 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" event={"ID":"acd636bf-528e-4bbe-8220-e4a9b755b025","Type":"ContainerStarted","Data":"f9d5cec03e92090ebb30b6358fe9f58629acc7d622c37fad981dfbe667a8e18c"} Feb 27 01:33:34 crc kubenswrapper[4771]: I0227 01:33:34.921208 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" podStartSLOduration=2.4807761680000002 podStartE2EDuration="2.921190709s" podCreationTimestamp="2026-02-27 01:33:32 +0000 UTC" firstStartedPulling="2026-02-27 01:33:33.914948459 +0000 UTC m=+1726.852509747" lastFinishedPulling="2026-02-27 01:33:34.355363 +0000 UTC m=+1727.292924288" observedRunningTime="2026-02-27 01:33:34.916601165 +0000 UTC m=+1727.854162453" watchObservedRunningTime="2026-02-27 01:33:34.921190709 +0000 UTC m=+1727.858751997" Feb 27 01:33:39 crc kubenswrapper[4771]: I0227 01:33:39.062630 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xxhc8"] 
Feb 27 01:33:39 crc kubenswrapper[4771]: I0227 01:33:39.074398 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xxhc8"]
Feb 27 01:33:39 crc kubenswrapper[4771]: I0227 01:33:39.794056 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e019cf0-706d-444b-98cb-b07123d2a0d1" path="/var/lib/kubelet/pods/1e019cf0-706d-444b-98cb-b07123d2a0d1/volumes"
Feb 27 01:33:44 crc kubenswrapper[4771]: I0227 01:33:44.033443 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cg8jm"]
Feb 27 01:33:44 crc kubenswrapper[4771]: I0227 01:33:44.042733 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cg8jm"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.072048 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5696-account-create-update-7mx4f"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.081208 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-sswqd"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.093036 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5lfsc"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.103516 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-201a-account-create-update-b8qmr"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.114604 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-201a-account-create-update-b8qmr"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.122099 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5696-account-create-update-7mx4f"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.131485 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-sswqd"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.139752 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5lfsc"]
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.787102 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46fd6f86-04d4-4c78-bbf8-9f057ac4308b" path="/var/lib/kubelet/pods/46fd6f86-04d4-4c78-bbf8-9f057ac4308b/volumes"
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.788185 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634488bc-0dd8-4dbc-8be4-99328c6a0088" path="/var/lib/kubelet/pods/634488bc-0dd8-4dbc-8be4-99328c6a0088/volumes"
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.788915 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733a6478-88f8-4dd2-ad0b-fa824ec14a4d" path="/var/lib/kubelet/pods/733a6478-88f8-4dd2-ad0b-fa824ec14a4d/volumes"
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.789699 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d47d411-ba32-47c5-96dc-448ab3aab865" path="/var/lib/kubelet/pods/9d47d411-ba32-47c5-96dc-448ab3aab865/volumes"
Feb 27 01:33:45 crc kubenswrapper[4771]: I0227 01:33:45.790958 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b12792-b88f-4d69-8df2-03dec12a53ac" path="/var/lib/kubelet/pods/b9b12792-b88f-4d69-8df2-03dec12a53ac/volumes"
Feb 27 01:33:48 crc kubenswrapper[4771]: I0227 01:33:48.039352 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b38e-account-create-update-d286j"]
Feb 27 01:33:48 crc kubenswrapper[4771]: I0227 01:33:48.052776 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b38e-account-create-update-d286j"]
Feb 27 01:33:49 crc kubenswrapper[4771]: I0227 01:33:49.786689 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="091872dd-fe0b-4e93-a837-2fd692af8f21" path="/var/lib/kubelet/pods/091872dd-fe0b-4e93-a837-2fd692af8f21/volumes"
Feb 27 01:33:50 crc kubenswrapper[4771]: I0227 01:33:50.036802 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jdwvn"]
Feb 27 01:33:50 crc kubenswrapper[4771]: I0227 01:33:50.045197 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jdwvn"]
Feb 27 01:33:51 crc kubenswrapper[4771]: I0227 01:33:51.782624 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd00ad5-f04b-4756-b60e-054f87509d3f" path="/var/lib/kubelet/pods/9cd00ad5-f04b-4756-b60e-054f87509d3f/volumes"
Feb 27 01:33:52 crc kubenswrapper[4771]: I0227 01:33:52.091820 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-l4w7n"]
Feb 27 01:33:52 crc kubenswrapper[4771]: I0227 01:33:52.100279 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-l4w7n"]
Feb 27 01:33:53 crc kubenswrapper[4771]: I0227 01:33:53.794587 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf593882-913d-4168-b14f-c7df95930f73" path="/var/lib/kubelet/pods/cf593882-913d-4168-b14f-c7df95930f73/volumes"
Feb 27 01:33:58 crc kubenswrapper[4771]: I0227 01:33:58.952965 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:33:58 crc kubenswrapper[4771]: I0227 01:33:58.953618 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.140926 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535934-lh2pf"]
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.143147 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535934-lh2pf"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.145778 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.145937 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.146004 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.154153 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535934-lh2pf"]
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.261504 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jsp8\" (UniqueName: \"kubernetes.io/projected/2e4c1bda-e210-4613-9bd2-34b82bc45640-kube-api-access-6jsp8\") pod \"auto-csr-approver-29535934-lh2pf\" (UID: \"2e4c1bda-e210-4613-9bd2-34b82bc45640\") " pod="openshift-infra/auto-csr-approver-29535934-lh2pf"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.364038 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jsp8\" (UniqueName: \"kubernetes.io/projected/2e4c1bda-e210-4613-9bd2-34b82bc45640-kube-api-access-6jsp8\") pod \"auto-csr-approver-29535934-lh2pf\" (UID: \"2e4c1bda-e210-4613-9bd2-34b82bc45640\") " pod="openshift-infra/auto-csr-approver-29535934-lh2pf"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.385205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jsp8\" (UniqueName: \"kubernetes.io/projected/2e4c1bda-e210-4613-9bd2-34b82bc45640-kube-api-access-6jsp8\") pod \"auto-csr-approver-29535934-lh2pf\" (UID: \"2e4c1bda-e210-4613-9bd2-34b82bc45640\") " pod="openshift-infra/auto-csr-approver-29535934-lh2pf"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.496108 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535934-lh2pf"
Feb 27 01:34:00 crc kubenswrapper[4771]: I0227 01:34:00.932791 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535934-lh2pf"]
Feb 27 01:34:01 crc kubenswrapper[4771]: I0227 01:34:01.183575 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535934-lh2pf" event={"ID":"2e4c1bda-e210-4613-9bd2-34b82bc45640","Type":"ContainerStarted","Data":"2f61567892dfac1605d1859a63a88ac089d13cc90f1b033a53506c03760f1f86"}
Feb 27 01:34:03 crc kubenswrapper[4771]: I0227 01:34:03.210815 4771 generic.go:334] "Generic (PLEG): container finished" podID="2e4c1bda-e210-4613-9bd2-34b82bc45640" containerID="2da1f5f128734333c8e6b022f59c10e537034ce9ee5b4bb44a76bae45cd9bbf9" exitCode=0
Feb 27 01:34:03 crc kubenswrapper[4771]: I0227 01:34:03.210901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535934-lh2pf" event={"ID":"2e4c1bda-e210-4613-9bd2-34b82bc45640","Type":"ContainerDied","Data":"2da1f5f128734333c8e6b022f59c10e537034ce9ee5b4bb44a76bae45cd9bbf9"}
Feb 27 01:34:04 crc kubenswrapper[4771]: I0227 01:34:04.608207 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535934-lh2pf"
Feb 27 01:34:04 crc kubenswrapper[4771]: I0227 01:34:04.656163 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jsp8\" (UniqueName: \"kubernetes.io/projected/2e4c1bda-e210-4613-9bd2-34b82bc45640-kube-api-access-6jsp8\") pod \"2e4c1bda-e210-4613-9bd2-34b82bc45640\" (UID: \"2e4c1bda-e210-4613-9bd2-34b82bc45640\") "
Feb 27 01:34:04 crc kubenswrapper[4771]: I0227 01:34:04.673771 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4c1bda-e210-4613-9bd2-34b82bc45640-kube-api-access-6jsp8" (OuterVolumeSpecName: "kube-api-access-6jsp8") pod "2e4c1bda-e210-4613-9bd2-34b82bc45640" (UID: "2e4c1bda-e210-4613-9bd2-34b82bc45640"). InnerVolumeSpecName "kube-api-access-6jsp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:34:04 crc kubenswrapper[4771]: I0227 01:34:04.759293 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jsp8\" (UniqueName: \"kubernetes.io/projected/2e4c1bda-e210-4613-9bd2-34b82bc45640-kube-api-access-6jsp8\") on node \"crc\" DevicePath \"\""
Feb 27 01:34:05 crc kubenswrapper[4771]: I0227 01:34:05.232007 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535934-lh2pf" event={"ID":"2e4c1bda-e210-4613-9bd2-34b82bc45640","Type":"ContainerDied","Data":"2f61567892dfac1605d1859a63a88ac089d13cc90f1b033a53506c03760f1f86"}
Feb 27 01:34:05 crc kubenswrapper[4771]: I0227 01:34:05.232064 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f61567892dfac1605d1859a63a88ac089d13cc90f1b033a53506c03760f1f86"
Feb 27 01:34:05 crc kubenswrapper[4771]: I0227 01:34:05.232061 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535934-lh2pf"
Feb 27 01:34:05 crc kubenswrapper[4771]: I0227 01:34:05.695277 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535928-jj8zd"]
Feb 27 01:34:05 crc kubenswrapper[4771]: I0227 01:34:05.703632 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535928-jj8zd"]
Feb 27 01:34:05 crc kubenswrapper[4771]: I0227 01:34:05.792674 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d28e3a3-ccd6-4c74-996b-cb8844471672" path="/var/lib/kubelet/pods/0d28e3a3-ccd6-4c74-996b-cb8844471672/volumes"
Feb 27 01:34:23 crc kubenswrapper[4771]: I0227 01:34:23.056663 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mg85q"]
Feb 27 01:34:23 crc kubenswrapper[4771]: I0227 01:34:23.070273 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mg85q"]
Feb 27 01:34:23 crc kubenswrapper[4771]: I0227 01:34:23.796932 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82930a8-1630-4e85-86f0-0f2027e7225d" path="/var/lib/kubelet/pods/d82930a8-1630-4e85-86f0-0f2027e7225d/volumes"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.281029 4771 scope.go:117] "RemoveContainer" containerID="086025db74cc9b34ed6cf0ddb188159f320cba3aa21b4cc17b433b8b021fd3de"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.314468 4771 scope.go:117] "RemoveContainer" containerID="b2c1d545d61e8fbda505e99057e7cbf67ea03b9e8da2fabebf60d89760134563"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.363092 4771 scope.go:117] "RemoveContainer" containerID="34409f10da27221006f5134c9147e1b83cd82ce20cd7bfc0019338bfb6fee5a3"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.419884 4771 scope.go:117] "RemoveContainer" containerID="ac172242da8d96913693eb009e7bd6164a310f4fb458cfa6efe59999b377503f"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.456022 4771 scope.go:117] "RemoveContainer" containerID="c3c14a90a5c23fa0f6008256eb0bb7d14ebd6bc3f79eb00361481bca14439cd0"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.514244 4771 scope.go:117] "RemoveContainer" containerID="a9343567e9e56567b52bb82f302faacea0cb49a9e19b11a33b359b8ee0113ca6"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.544217 4771 scope.go:117] "RemoveContainer" containerID="2c38bbafbf6607bc5a1382e4dffd8f62f5ccb078bb97519ab8e7f00014769e65"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.569068 4771 scope.go:117] "RemoveContainer" containerID="5c83d516e9c76ea279bfeec3f69185c3127e0875de93167efaf0fe4233e0ed79"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.601221 4771 scope.go:117] "RemoveContainer" containerID="e0e45977b813e294d17ab1b1dd234d6ea49582deb369001e7d9b65560cfd6936"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.652266 4771 scope.go:117] "RemoveContainer" containerID="78fcd55cba24557ee0c5e9291f2b4f14ef5cd69699b00ead6dac6bbbd1fe2ef4"
Feb 27 01:34:27 crc kubenswrapper[4771]: I0227 01:34:27.705989 4771 scope.go:117] "RemoveContainer" containerID="47b89cdc4291cbcb5b6c29ef410d93e1215af46db51f890e53a4204f77f24d90"
Feb 27 01:34:28 crc kubenswrapper[4771]: I0227 01:34:28.953400 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:34:28 crc kubenswrapper[4771]: I0227 01:34:28.953931 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:34:28 crc kubenswrapper[4771]: I0227 01:34:28.953992 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn"
Feb 27 01:34:28 crc kubenswrapper[4771]: I0227 01:34:28.955287 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 01:34:28 crc kubenswrapper[4771]: I0227 01:34:28.955368 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" gracePeriod=600
Feb 27 01:34:29 crc kubenswrapper[4771]: E0227 01:34:29.076858 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:34:29 crc kubenswrapper[4771]: I0227 01:34:29.485147 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" exitCode=0
Feb 27 01:34:29 crc kubenswrapper[4771]: I0227 01:34:29.485240 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"}
Feb 27 01:34:29 crc kubenswrapper[4771]: I0227 01:34:29.485513 4771 scope.go:117] "RemoveContainer" containerID="c2fa94f2e2bead8dd6b922ec063e6ef0f0039cd25cc010b30deb4ce3bb130b4c"
Feb 27 01:34:29 crc kubenswrapper[4771]: I0227 01:34:29.486501 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:34:29 crc kubenswrapper[4771]: E0227 01:34:29.486871 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:34:38 crc kubenswrapper[4771]: I0227 01:34:38.050049 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kqdqm"]
Feb 27 01:34:38 crc kubenswrapper[4771]: I0227 01:34:38.068125 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kqdqm"]
Feb 27 01:34:39 crc kubenswrapper[4771]: I0227 01:34:39.785009 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ec654f-6921-476d-8001-aec299744492" path="/var/lib/kubelet/pods/57ec654f-6921-476d-8001-aec299744492/volumes"
Feb 27 01:34:44 crc kubenswrapper[4771]: I0227 01:34:44.645451 4771 generic.go:334] "Generic (PLEG): container finished" podID="acd636bf-528e-4bbe-8220-e4a9b755b025" containerID="2a7b691523c1868418ac367b6f8bdd30409429d829e2243a0d42c8ae870fc256" exitCode=0
Feb 27 01:34:44 crc kubenswrapper[4771]: I0227 01:34:44.645612 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" event={"ID":"acd636bf-528e-4bbe-8220-e4a9b755b025","Type":"ContainerDied","Data":"2a7b691523c1868418ac367b6f8bdd30409429d829e2243a0d42c8ae870fc256"}
Feb 27 01:34:44 crc kubenswrapper[4771]: I0227 01:34:44.773652 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:34:44 crc kubenswrapper[4771]: E0227 01:34:44.773916 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.035722 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-r69k6"]
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.042784 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-r69k6"]
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.097359 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.108535 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-inventory\") pod \"acd636bf-528e-4bbe-8220-e4a9b755b025\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") "
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.108805 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-ssh-key-openstack-edpm-ipam\") pod \"acd636bf-528e-4bbe-8220-e4a9b755b025\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") "
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.108916 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p86kn\" (UniqueName: \"kubernetes.io/projected/acd636bf-528e-4bbe-8220-e4a9b755b025-kube-api-access-p86kn\") pod \"acd636bf-528e-4bbe-8220-e4a9b755b025\" (UID: \"acd636bf-528e-4bbe-8220-e4a9b755b025\") "
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.115429 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd636bf-528e-4bbe-8220-e4a9b755b025-kube-api-access-p86kn" (OuterVolumeSpecName: "kube-api-access-p86kn") pod "acd636bf-528e-4bbe-8220-e4a9b755b025" (UID: "acd636bf-528e-4bbe-8220-e4a9b755b025"). InnerVolumeSpecName "kube-api-access-p86kn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.137479 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-inventory" (OuterVolumeSpecName: "inventory") pod "acd636bf-528e-4bbe-8220-e4a9b755b025" (UID: "acd636bf-528e-4bbe-8220-e4a9b755b025"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.143040 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "acd636bf-528e-4bbe-8220-e4a9b755b025" (UID: "acd636bf-528e-4bbe-8220-e4a9b755b025"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.211605 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.211653 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p86kn\" (UniqueName: \"kubernetes.io/projected/acd636bf-528e-4bbe-8220-e4a9b755b025-kube-api-access-p86kn\") on node \"crc\" DevicePath \"\""
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.211667 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acd636bf-528e-4bbe-8220-e4a9b755b025-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.667335 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf" event={"ID":"acd636bf-528e-4bbe-8220-e4a9b755b025","Type":"ContainerDied","Data":"f9d5cec03e92090ebb30b6358fe9f58629acc7d622c37fad981dfbe667a8e18c"}
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.667899 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9d5cec03e92090ebb30b6358fe9f58629acc7d622c37fad981dfbe667a8e18c"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.667405 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.761246 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"]
Feb 27 01:34:46 crc kubenswrapper[4771]: E0227 01:34:46.761602 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd636bf-528e-4bbe-8220-e4a9b755b025" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.761619 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd636bf-528e-4bbe-8220-e4a9b755b025" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:34:46 crc kubenswrapper[4771]: E0227 01:34:46.761639 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4c1bda-e210-4613-9bd2-34b82bc45640" containerName="oc"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.761646 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4c1bda-e210-4613-9bd2-34b82bc45640" containerName="oc"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.761823 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd636bf-528e-4bbe-8220-e4a9b755b025" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.761837 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4c1bda-e210-4613-9bd2-34b82bc45640" containerName="oc"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.762405 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.764688 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.764707 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.764887 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.769142 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.787918 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"]
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.823633 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.823672 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.823758 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrpf5\" (UniqueName: \"kubernetes.io/projected/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-kube-api-access-vrpf5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.928465 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrpf5\" (UniqueName: \"kubernetes.io/projected/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-kube-api-access-vrpf5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.928769 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.928821 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.933971 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.935794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:46 crc kubenswrapper[4771]: I0227 01:34:46.946901 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrpf5\" (UniqueName: \"kubernetes.io/projected/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-kube-api-access-vrpf5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:47 crc kubenswrapper[4771]: I0227 01:34:47.035763 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-tclqb"]
Feb 27 01:34:47 crc kubenswrapper[4771]: I0227 01:34:47.045054 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-tclqb"]
Feb 27 01:34:47 crc kubenswrapper[4771]: I0227 01:34:47.085345 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:47 crc kubenswrapper[4771]: I0227 01:34:47.636121 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"]
Feb 27 01:34:47 crc kubenswrapper[4771]: I0227 01:34:47.647665 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 01:34:47 crc kubenswrapper[4771]: I0227 01:34:47.682214 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5" event={"ID":"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b","Type":"ContainerStarted","Data":"96d62ee545503ab5ac71c01949e09ba69b92295d9d069e4c846e9ba0c0474918"}
Feb 27 01:34:47 crc kubenswrapper[4771]: I0227 01:34:47.799789 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a592bd48-ea9a-4f6c-a7fe-49185fbbed82" path="/var/lib/kubelet/pods/a592bd48-ea9a-4f6c-a7fe-49185fbbed82/volumes"
Feb 27 01:34:47 crc kubenswrapper[4771]: I0227 01:34:47.805765 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b411543d-f7a2-4a56-acb5-9b2d9598739a" path="/var/lib/kubelet/pods/b411543d-f7a2-4a56-acb5-9b2d9598739a/volumes"
Feb 27 01:34:48 crc kubenswrapper[4771]: I0227 01:34:48.075483 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zhfdt"]
Feb 27 01:34:48 crc kubenswrapper[4771]: I0227 01:34:48.098663 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zhfdt"]
Feb 27 01:34:48 crc kubenswrapper[4771]: I0227 01:34:48.178814 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 01:34:48 crc kubenswrapper[4771]: I0227 01:34:48.693595 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5" event={"ID":"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b","Type":"ContainerStarted","Data":"77d8bc4291f27671ad4a299ffd1e477d1abf4f59cc38119ad39367752b783f21"}
Feb 27 01:34:48 crc kubenswrapper[4771]: I0227 01:34:48.720499 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5" podStartSLOduration=2.196195873 podStartE2EDuration="2.720471427s" podCreationTimestamp="2026-02-27 01:34:46 +0000 UTC" firstStartedPulling="2026-02-27 01:34:47.647456521 +0000 UTC m=+1800.585017809" lastFinishedPulling="2026-02-27 01:34:48.171732075 +0000 UTC m=+1801.109293363" observedRunningTime="2026-02-27 01:34:48.709128137 +0000 UTC m=+1801.646689455" watchObservedRunningTime="2026-02-27 01:34:48.720471427 +0000 UTC m=+1801.658032755"
Feb 27 01:34:49 crc kubenswrapper[4771]: I0227 01:34:49.784686 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e7849a-97b9-4e3d-9ad3-c0c942775e64" path="/var/lib/kubelet/pods/37e7849a-97b9-4e3d-9ad3-c0c942775e64/volumes"
Feb 27 01:34:53 crc kubenswrapper[4771]: I0227 01:34:53.741399 4771 generic.go:334] "Generic (PLEG): container finished" podID="fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b" containerID="77d8bc4291f27671ad4a299ffd1e477d1abf4f59cc38119ad39367752b783f21" exitCode=0
Feb 27 01:34:53 crc kubenswrapper[4771]: I0227 01:34:53.741698 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5" event={"ID":"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b","Type":"ContainerDied","Data":"77d8bc4291f27671ad4a299ffd1e477d1abf4f59cc38119ad39367752b783f21"}
Feb 27 01:34:55 crc kubenswrapper[4771]: I0227 01:34:55.415727 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:34:55 crc kubenswrapper[4771]: E0227 01:34:55.416046 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:34:55 crc kubenswrapper[4771]: I0227 01:34:55.901625 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:55 crc kubenswrapper[4771]: I0227 01:34:55.933382 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-inventory\") pod \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") "
Feb 27 01:34:55 crc kubenswrapper[4771]: I0227 01:34:55.933468 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrpf5\" (UniqueName: \"kubernetes.io/projected/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-kube-api-access-vrpf5\") pod \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") "
Feb 27 01:34:55 crc kubenswrapper[4771]: I0227 01:34:55.933735 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-ssh-key-openstack-edpm-ipam\") pod \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\" (UID: \"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b\") "
Feb 27 01:34:55 crc kubenswrapper[4771]: I0227 01:34:55.943076 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-kube-api-access-vrpf5" (OuterVolumeSpecName: "kube-api-access-vrpf5") pod "fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b" (UID: "fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b"). InnerVolumeSpecName "kube-api-access-vrpf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:34:55 crc kubenswrapper[4771]: I0227 01:34:55.966433 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-inventory" (OuterVolumeSpecName: "inventory") pod "fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b" (UID: "fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:34:55 crc kubenswrapper[4771]: I0227 01:34:55.972980 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b" (UID: "fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.037669 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.038091 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.038120 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrpf5\" (UniqueName: \"kubernetes.io/projected/fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b-kube-api-access-vrpf5\") on node \"crc\" DevicePath \"\""
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.452204 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5" event={"ID":"fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b","Type":"ContainerDied","Data":"96d62ee545503ab5ac71c01949e09ba69b92295d9d069e4c846e9ba0c0474918"}
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.452248 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96d62ee545503ab5ac71c01949e09ba69b92295d9d069e4c846e9ba0c0474918"
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.452272 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5"
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.994735 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"]
Feb 27 01:34:56 crc kubenswrapper[4771]: E0227 01:34:56.995331 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.995361 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.995725 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.996541 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.999811 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 01:34:56 crc kubenswrapper[4771]: I0227 01:34:56.999868 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.000521 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.000521 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.009300 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"]
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.058463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.058679 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkf86\" (UniqueName: \"kubernetes.io/projected/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-kube-api-access-bkf86\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.058774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.160252 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkf86\" (UniqueName: \"kubernetes.io/projected/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-kube-api-access-bkf86\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.160317 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.160436 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.166677 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.167102 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.182409 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkf86\" (UniqueName: \"kubernetes.io/projected/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-kube-api-access-bkf86\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4kmmj\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:57 crc kubenswrapper[4771]: I0227 01:34:57.319422 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"
Feb 27 01:34:58 crc kubenswrapper[4771]: I0227 01:34:58.030330 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj"]
Feb 27 01:34:58 crc kubenswrapper[4771]: I0227 01:34:58.494903 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj" event={"ID":"53bf3a2a-497c-4432-8b0f-e8092fcb72ff","Type":"ContainerStarted","Data":"6cbf18c87857b2f416310e9001e7ed00cca93434804d1aefd18453ae181d8f87"}
Feb 27 01:34:59 crc kubenswrapper[4771]: I0227 01:34:59.504883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj" event={"ID":"53bf3a2a-497c-4432-8b0f-e8092fcb72ff","Type":"ContainerStarted","Data":"2dad0cf5a5d2322c36020693571c78c63d6c7ff027ef6a02ba606903e72489e6"}
Feb 27 01:34:59 crc kubenswrapper[4771]: I0227 01:34:59.523202 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj" podStartSLOduration=3.057769481 podStartE2EDuration="3.523185477s" podCreationTimestamp="2026-02-27 01:34:56 +0000 UTC" firstStartedPulling="2026-02-27 01:34:58.043463091 +0000 UTC m=+1810.981024379" lastFinishedPulling="2026-02-27 01:34:58.508879067 +0000 UTC m=+1811.446440375" observedRunningTime="2026-02-27 01:34:59.520025631 +0000 UTC m=+1812.457586919" watchObservedRunningTime="2026-02-27 01:34:59.523185477 +0000 UTC m=+1812.460746765"
Feb 27 01:35:06 crc kubenswrapper[4771]: I0227 01:35:06.774014 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:35:06 crc kubenswrapper[4771]: E0227 01:35:06.775105 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:35:20 crc kubenswrapper[4771]: I0227 01:35:20.773993 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:35:20 crc kubenswrapper[4771]: E0227 01:35:20.774978 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:35:27 crc kubenswrapper[4771]: I0227 01:35:27.960721 4771 scope.go:117] "RemoveContainer" containerID="cbd0d7c7d1911e59d1ef392693fbaee34ea486d72e649c9b2f23b338d82aa822"
Feb 27 01:35:28 crc kubenswrapper[4771]: I0227 01:35:28.003159 4771 scope.go:117] "RemoveContainer" containerID="96a82b8dbdf626beb081b086821121237bcc704dca5923bbb45584bbac786f18"
Feb 27 01:35:28 crc kubenswrapper[4771]: I0227 01:35:28.065840 4771 scope.go:117] "RemoveContainer" containerID="e92d952b0af34475040583946aa19405a784b9afca08253635c5d9f8c5d9db1d"
Feb 27 01:35:28 crc kubenswrapper[4771]: I0227 01:35:28.128020 4771 scope.go:117] "RemoveContainer" containerID="c696924d536a08d6ade8f59f5787d17d17f48f3070b5b227285d140bca360a9c"
Feb 27 01:35:32 crc kubenswrapper[4771]: I0227 01:35:32.773649 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:35:32 crc kubenswrapper[4771]: E0227 01:35:32.774860 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.069337 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-ssvdh"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.076738 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-eb75-account-create-update-9d68f"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.083417 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a6fe-account-create-update-52b7q"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.093982 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kz4w4"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.102137 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d71f-account-create-update-qphgx"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.108774 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-ssvdh"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.115190 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7hm42"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.121825 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-eb75-account-create-update-9d68f"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.128044 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d71f-account-create-update-qphgx"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.133936 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7hm42"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.139814 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kz4w4"]
Feb 27 01:35:34 crc kubenswrapper[4771]: I0227 01:35:34.145688 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a6fe-account-create-update-52b7q"]
Feb 27 01:35:35 crc kubenswrapper[4771]: I0227 01:35:35.788100 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c70c92b-3989-4fa8-9a8a-ce6be8839d8e" path="/var/lib/kubelet/pods/1c70c92b-3989-4fa8-9a8a-ce6be8839d8e/volumes"
Feb 27 01:35:35 crc kubenswrapper[4771]: I0227 01:35:35.790104 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac233507-57ad-484c-817b-270cee86a50a" path="/var/lib/kubelet/pods/ac233507-57ad-484c-817b-270cee86a50a/volumes"
Feb 27 01:35:35 crc kubenswrapper[4771]: I0227 01:35:35.791357 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8681229-5d10-47f2-8cdf-fa8b6c584ef8" path="/var/lib/kubelet/pods/d8681229-5d10-47f2-8cdf-fa8b6c584ef8/volumes"
Feb 27 01:35:35 crc kubenswrapper[4771]: I0227 01:35:35.792831 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18a7c26-064f-4b67-b9e2-d8a66499cec8" path="/var/lib/kubelet/pods/e18a7c26-064f-4b67-b9e2-d8a66499cec8/volumes"
Feb 27 01:35:35 crc kubenswrapper[4771]: I0227 01:35:35.795849 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e8c01e-0567-48bd-aaef-580afc5667af" path="/var/lib/kubelet/pods/e6e8c01e-0567-48bd-aaef-580afc5667af/volumes"
Feb 27 01:35:35 crc kubenswrapper[4771]: I0227 01:35:35.797021 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7" path="/var/lib/kubelet/pods/f2fe7b97-6e66-4c80-87c1-a81f5efb5dc7/volumes"
Feb 27 01:35:35 crc kubenswrapper[4771]: I0227 01:35:35.916360 4771 generic.go:334] "Generic (PLEG): container finished" podID="53bf3a2a-497c-4432-8b0f-e8092fcb72ff" containerID="2dad0cf5a5d2322c36020693571c78c63d6c7ff027ef6a02ba606903e72489e6" exitCode=0
Feb 27 01:35:35 crc kubenswrapper[4771]: I0227 01:35:35.916411 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj" event={"ID":"53bf3a2a-497c-4432-8b0f-e8092fcb72ff","Type":"ContainerDied","Data":"2dad0cf5a5d2322c36020693571c78c63d6c7ff027ef6a02ba606903e72489e6"}
Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.394698 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj" Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.476777 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-ssh-key-openstack-edpm-ipam\") pod \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.477029 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkf86\" (UniqueName: \"kubernetes.io/projected/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-kube-api-access-bkf86\") pod \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.477485 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-inventory\") pod \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\" (UID: \"53bf3a2a-497c-4432-8b0f-e8092fcb72ff\") " Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.485016 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-kube-api-access-bkf86" (OuterVolumeSpecName: "kube-api-access-bkf86") pod "53bf3a2a-497c-4432-8b0f-e8092fcb72ff" (UID: "53bf3a2a-497c-4432-8b0f-e8092fcb72ff"). InnerVolumeSpecName "kube-api-access-bkf86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.509456 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-inventory" (OuterVolumeSpecName: "inventory") pod "53bf3a2a-497c-4432-8b0f-e8092fcb72ff" (UID: "53bf3a2a-497c-4432-8b0f-e8092fcb72ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.529259 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "53bf3a2a-497c-4432-8b0f-e8092fcb72ff" (UID: "53bf3a2a-497c-4432-8b0f-e8092fcb72ff"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.580625 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.580724 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.580779 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkf86\" (UniqueName: \"kubernetes.io/projected/53bf3a2a-497c-4432-8b0f-e8092fcb72ff-kube-api-access-bkf86\") on node \"crc\" DevicePath \"\"" Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.938821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj" event={"ID":"53bf3a2a-497c-4432-8b0f-e8092fcb72ff","Type":"ContainerDied","Data":"6cbf18c87857b2f416310e9001e7ed00cca93434804d1aefd18453ae181d8f87"} Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.938896 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4kmmj" Feb 27 01:35:37 crc kubenswrapper[4771]: I0227 01:35:37.938898 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cbf18c87857b2f416310e9001e7ed00cca93434804d1aefd18453ae181d8f87" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.062934 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl"] Feb 27 01:35:38 crc kubenswrapper[4771]: E0227 01:35:38.069380 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bf3a2a-497c-4432-8b0f-e8092fcb72ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.069672 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bf3a2a-497c-4432-8b0f-e8092fcb72ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.070012 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bf3a2a-497c-4432-8b0f-e8092fcb72ff" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.071157 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.074052 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl"] Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.121992 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.122242 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.122486 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.122827 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.224579 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.224639 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.225211 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxtkr\" (UniqueName: \"kubernetes.io/projected/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-kube-api-access-bxtkr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.327057 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.327139 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.327363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxtkr\" (UniqueName: 
\"kubernetes.io/projected/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-kube-api-access-bxtkr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.334201 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.334264 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.346725 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxtkr\" (UniqueName: \"kubernetes.io/projected/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-kube-api-access-bxtkr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:38 crc kubenswrapper[4771]: I0227 01:35:38.447668 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:35:39 crc kubenswrapper[4771]: I0227 01:35:39.001748 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl"] Feb 27 01:35:39 crc kubenswrapper[4771]: I0227 01:35:39.960285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" event={"ID":"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e","Type":"ContainerStarted","Data":"553baeeefb4bf12e9c0cbde0dfa0f32cbfc482fe2e470ae0508f7134240f4c83"} Feb 27 01:35:39 crc kubenswrapper[4771]: I0227 01:35:39.960716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" event={"ID":"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e","Type":"ContainerStarted","Data":"7824c5a4db32d098444152dff47b8eb5e7034bb75cb675972cf076e84cfc8787"} Feb 27 01:35:39 crc kubenswrapper[4771]: I0227 01:35:39.980347 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" podStartSLOduration=1.538602502 podStartE2EDuration="1.980322939s" podCreationTimestamp="2026-02-27 01:35:38 +0000 UTC" firstStartedPulling="2026-02-27 01:35:39.034716795 +0000 UTC m=+1851.972278083" lastFinishedPulling="2026-02-27 01:35:39.476437232 +0000 UTC m=+1852.413998520" observedRunningTime="2026-02-27 01:35:39.976775052 +0000 UTC m=+1852.914336350" watchObservedRunningTime="2026-02-27 01:35:39.980322939 +0000 UTC m=+1852.917884227" Feb 27 01:35:45 crc kubenswrapper[4771]: I0227 01:35:45.774200 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:35:45 crc kubenswrapper[4771]: E0227 
01:35:45.775041 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:35:57 crc kubenswrapper[4771]: I0227 01:35:57.075654 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh9lc"] Feb 27 01:35:57 crc kubenswrapper[4771]: I0227 01:35:57.089080 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh9lc"] Feb 27 01:35:57 crc kubenswrapper[4771]: I0227 01:35:57.785768 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a927ae95-187b-4517-b54d-7faaf7de3155" path="/var/lib/kubelet/pods/a927ae95-187b-4517-b54d-7faaf7de3155/volumes" Feb 27 01:35:58 crc kubenswrapper[4771]: I0227 01:35:58.773461 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:35:58 crc kubenswrapper[4771]: E0227 01:35:58.774059 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.180205 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535936-9v8rl"] Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.182022 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535936-9v8rl" Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.186061 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.186405 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.187160 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4scd\" (UniqueName: \"kubernetes.io/projected/c4ccf7d9-1c8a-4135-a561-4696af85f0d2-kube-api-access-g4scd\") pod \"auto-csr-approver-29535936-9v8rl\" (UID: \"c4ccf7d9-1c8a-4135-a561-4696af85f0d2\") " pod="openshift-infra/auto-csr-approver-29535936-9v8rl" Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.190317 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.193032 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535936-9v8rl"] Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.287860 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4scd\" (UniqueName: \"kubernetes.io/projected/c4ccf7d9-1c8a-4135-a561-4696af85f0d2-kube-api-access-g4scd\") pod \"auto-csr-approver-29535936-9v8rl\" (UID: \"c4ccf7d9-1c8a-4135-a561-4696af85f0d2\") " pod="openshift-infra/auto-csr-approver-29535936-9v8rl" Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.311027 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4scd\" (UniqueName: \"kubernetes.io/projected/c4ccf7d9-1c8a-4135-a561-4696af85f0d2-kube-api-access-g4scd\") pod \"auto-csr-approver-29535936-9v8rl\" (UID: \"c4ccf7d9-1c8a-4135-a561-4696af85f0d2\") " pod="openshift-infra/auto-csr-approver-29535936-9v8rl" Feb 27 01:36:00 crc kubenswrapper[4771]: I0227 01:36:00.503668 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535936-9v8rl" Feb 27 01:36:01 crc kubenswrapper[4771]: I0227 01:36:01.009852 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535936-9v8rl"] Feb 27 01:36:01 crc kubenswrapper[4771]: W0227 01:36:01.011493 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4ccf7d9_1c8a_4135_a561_4696af85f0d2.slice/crio-5ca29b446a65df613b4db2acf837e87b2960a19e84c2153816c0e48ff37cced1 WatchSource:0}: Error finding container 5ca29b446a65df613b4db2acf837e87b2960a19e84c2153816c0e48ff37cced1: Status 404 returned error can't find the container with id 5ca29b446a65df613b4db2acf837e87b2960a19e84c2153816c0e48ff37cced1 Feb 27 01:36:01 crc kubenswrapper[4771]: I0227 01:36:01.157704 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535936-9v8rl" event={"ID":"c4ccf7d9-1c8a-4135-a561-4696af85f0d2","Type":"ContainerStarted","Data":"5ca29b446a65df613b4db2acf837e87b2960a19e84c2153816c0e48ff37cced1"} Feb 27 01:36:03 crc kubenswrapper[4771]: I0227 01:36:03.175847 4771 generic.go:334] "Generic (PLEG): container finished" podID="c4ccf7d9-1c8a-4135-a561-4696af85f0d2" containerID="8e9357fb966ff6815d3212cc080f320deadd0e408de9e6b6cc13ed3148b0b347" exitCode=0 Feb 27 01:36:03 crc kubenswrapper[4771]: I0227 01:36:03.175921 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535936-9v8rl" event={"ID":"c4ccf7d9-1c8a-4135-a561-4696af85f0d2","Type":"ContainerDied","Data":"8e9357fb966ff6815d3212cc080f320deadd0e408de9e6b6cc13ed3148b0b347"} Feb 27 01:36:04 crc kubenswrapper[4771]: I0227 01:36:04.604893 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535936-9v8rl" Feb 27 01:36:04 crc kubenswrapper[4771]: I0227 01:36:04.771926 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4scd\" (UniqueName: \"kubernetes.io/projected/c4ccf7d9-1c8a-4135-a561-4696af85f0d2-kube-api-access-g4scd\") pod \"c4ccf7d9-1c8a-4135-a561-4696af85f0d2\" (UID: \"c4ccf7d9-1c8a-4135-a561-4696af85f0d2\") " Feb 27 01:36:04 crc kubenswrapper[4771]: I0227 01:36:04.782773 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ccf7d9-1c8a-4135-a561-4696af85f0d2-kube-api-access-g4scd" (OuterVolumeSpecName: "kube-api-access-g4scd") pod "c4ccf7d9-1c8a-4135-a561-4696af85f0d2" (UID: "c4ccf7d9-1c8a-4135-a561-4696af85f0d2"). InnerVolumeSpecName "kube-api-access-g4scd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:36:04 crc kubenswrapper[4771]: I0227 01:36:04.874755 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4scd\" (UniqueName: \"kubernetes.io/projected/c4ccf7d9-1c8a-4135-a561-4696af85f0d2-kube-api-access-g4scd\") on node \"crc\" DevicePath \"\"" Feb 27 01:36:05 crc kubenswrapper[4771]: I0227 01:36:05.196521 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535936-9v8rl" event={"ID":"c4ccf7d9-1c8a-4135-a561-4696af85f0d2","Type":"ContainerDied","Data":"5ca29b446a65df613b4db2acf837e87b2960a19e84c2153816c0e48ff37cced1"} Feb 27 01:36:05 crc kubenswrapper[4771]: I0227 01:36:05.196605 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca29b446a65df613b4db2acf837e87b2960a19e84c2153816c0e48ff37cced1" Feb 27 01:36:05 crc kubenswrapper[4771]: I0227 01:36:05.196601 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535936-9v8rl" Feb 27 01:36:05 crc kubenswrapper[4771]: I0227 01:36:05.670771 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535930-g6sj2"] Feb 27 01:36:05 crc kubenswrapper[4771]: I0227 01:36:05.682659 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535930-g6sj2"] Feb 27 01:36:05 crc kubenswrapper[4771]: I0227 01:36:05.787814 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3b6274-faf2-4259-908a-c50a1babdb81" path="/var/lib/kubelet/pods/3e3b6274-faf2-4259-908a-c50a1babdb81/volumes" Feb 27 01:36:12 crc kubenswrapper[4771]: I0227 01:36:12.772982 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:36:12 crc kubenswrapper[4771]: E0227 01:36:12.774256 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:36:19 crc kubenswrapper[4771]: I0227 01:36:19.030610 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-6vpvq"] Feb 27 01:36:19 crc kubenswrapper[4771]: I0227 01:36:19.038984 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-6vpvq"] Feb 27 01:36:19 crc kubenswrapper[4771]: I0227 01:36:19.787956 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5da4dc-52ee-48f5-b5af-0fea453db0d7" path="/var/lib/kubelet/pods/4d5da4dc-52ee-48f5-b5af-0fea453db0d7/volumes" Feb 27 01:36:20 crc kubenswrapper[4771]: I0227 01:36:20.032930 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbd9v"] Feb 27 01:36:20 crc kubenswrapper[4771]: I0227 01:36:20.042539 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bbd9v"] Feb 27 01:36:21 crc kubenswrapper[4771]: I0227 01:36:21.799429 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c84cc9-8833-4227-af3f-7064c9232366" path="/var/lib/kubelet/pods/76c84cc9-8833-4227-af3f-7064c9232366/volumes" Feb 27 01:36:25 crc 
kubenswrapper[4771]: I0227 01:36:25.774172 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:36:25 crc kubenswrapper[4771]: E0227 01:36:25.775632 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.236920 4771 scope.go:117] "RemoveContainer" containerID="ee68bff01ce8ca900c86f5a63b7a74e4a67b432bd36c1ab26062f6d21057fcee" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.288313 4771 scope.go:117] "RemoveContainer" containerID="e46e028aae564b8b61bae21b82557178b6c6124429818922dfca92da85ac543f" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.351279 4771 scope.go:117] "RemoveContainer" containerID="37c856aacb1089f98e87ed101087ce663fea108416ee0ee9716ff87847067073" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.386247 4771 scope.go:117] "RemoveContainer" containerID="ec2f9a03b3dda302804ac55c3e4a7b3e8c1f087c8f42a89c2abf799701567422" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.503273 4771 scope.go:117] "RemoveContainer" containerID="bcdff32e1ad8a2f8ddd816aa35365e0c1ab30ef0c1ab7ec36c45ddd67cde41fe" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.531580 4771 scope.go:117] "RemoveContainer" containerID="de4c99802ee4fb5b938608f68ba2f8f6e8581da8ec45da90efbd05b14c2e2124" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.566097 4771 scope.go:117] "RemoveContainer" containerID="a2240891cb114d6fa5dcbf2cd6dcd84379d8ca96e14a60c684c90dece827cd5d" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.628443 4771 scope.go:117] "RemoveContainer" containerID="bf2e80dfffdae487e743928065935ce3b663de7121e35856f3e40f18ab34213c" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.649505 4771 scope.go:117] "RemoveContainer" containerID="a03b248a22aa2e9b7dd3f17102eb2ab0fdc9f235c7b021aae5b056ecef362cb1" Feb 27 01:36:28 crc kubenswrapper[4771]: I0227 01:36:28.694609 4771 scope.go:117] "RemoveContainer" containerID="4f25ff54fef33988e38ce0a2441a74ef999ed718b9f05e65087731a7701707d7" Feb 27 01:36:30 crc kubenswrapper[4771]: I0227 01:36:30.463183 4771 generic.go:334] "Generic (PLEG): container finished" podID="67424e5d-eec0-4d0c-ba08-eebe40f4ac6e" containerID="553baeeefb4bf12e9c0cbde0dfa0f32cbfc482fe2e470ae0508f7134240f4c83" exitCode=0 Feb 27 01:36:30 crc kubenswrapper[4771]: I0227 01:36:30.463233 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" event={"ID":"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e","Type":"ContainerDied","Data":"553baeeefb4bf12e9c0cbde0dfa0f32cbfc482fe2e470ae0508f7134240f4c83"} Feb 27 01:36:31 crc kubenswrapper[4771]: I0227 01:36:31.894878 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.015827 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxtkr\" (UniqueName: \"kubernetes.io/projected/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-kube-api-access-bxtkr\") pod \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.016010 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-inventory\") pod \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.016035 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-ssh-key-openstack-edpm-ipam\") pod \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\" (UID: \"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e\") " Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.022801 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-kube-api-access-bxtkr" (OuterVolumeSpecName: "kube-api-access-bxtkr") pod "67424e5d-eec0-4d0c-ba08-eebe40f4ac6e" (UID: "67424e5d-eec0-4d0c-ba08-eebe40f4ac6e"). InnerVolumeSpecName "kube-api-access-bxtkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.055321 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-inventory" (OuterVolumeSpecName: "inventory") pod "67424e5d-eec0-4d0c-ba08-eebe40f4ac6e" (UID: "67424e5d-eec0-4d0c-ba08-eebe40f4ac6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.070170 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "67424e5d-eec0-4d0c-ba08-eebe40f4ac6e" (UID: "67424e5d-eec0-4d0c-ba08-eebe40f4ac6e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.118889 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.118942 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.118991 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxtkr\" (UniqueName: \"kubernetes.io/projected/67424e5d-eec0-4d0c-ba08-eebe40f4ac6e-kube-api-access-bxtkr\") on node \"crc\" DevicePath \"\"" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.483478 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" event={"ID":"67424e5d-eec0-4d0c-ba08-eebe40f4ac6e","Type":"ContainerDied","Data":"7824c5a4db32d098444152dff47b8eb5e7034bb75cb675972cf076e84cfc8787"} Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.483893 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7824c5a4db32d098444152dff47b8eb5e7034bb75cb675972cf076e84cfc8787" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.483588 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.576387 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2v7s2"] Feb 27 01:36:32 crc kubenswrapper[4771]: E0227 01:36:32.576903 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67424e5d-eec0-4d0c-ba08-eebe40f4ac6e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.576926 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="67424e5d-eec0-4d0c-ba08-eebe40f4ac6e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 01:36:32 crc kubenswrapper[4771]: E0227 01:36:32.576962 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ccf7d9-1c8a-4135-a561-4696af85f0d2" containerName="oc" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.576972 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ccf7d9-1c8a-4135-a561-4696af85f0d2" containerName="oc" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.577224 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ccf7d9-1c8a-4135-a561-4696af85f0d2" containerName="oc" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.577245 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="67424e5d-eec0-4d0c-ba08-eebe40f4ac6e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.578072 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.584084 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.584263 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.584395 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.585062 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.601096 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2v7s2"] Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.732245 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.732619 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.732825 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvwms\" (UniqueName: \"kubernetes.io/projected/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-kube-api-access-jvwms\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.834692 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.834799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.834837 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvwms\" (UniqueName: \"kubernetes.io/projected/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-kube-api-access-jvwms\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc 
kubenswrapper[4771]: I0227 01:36:32.838016 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.838162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.850139 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvwms\" (UniqueName: \"kubernetes.io/projected/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-kube-api-access-jvwms\") pod \"ssh-known-hosts-edpm-deployment-2v7s2\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:32 crc kubenswrapper[4771]: I0227 01:36:32.895153 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:33 crc kubenswrapper[4771]: I0227 01:36:33.483894 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2v7s2"] Feb 27 01:36:33 crc kubenswrapper[4771]: W0227 01:36:33.487583 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59c79bc1_5dcb_495e_8ce8_7c74517d2df6.slice/crio-9a27bf26ccc13962429019a3ebc9a41093eb6d2f34a6bc9d66ab39360225dcf6 WatchSource:0}: Error finding container 9a27bf26ccc13962429019a3ebc9a41093eb6d2f34a6bc9d66ab39360225dcf6: Status 404 returned error can't find the container with id 9a27bf26ccc13962429019a3ebc9a41093eb6d2f34a6bc9d66ab39360225dcf6 Feb 27 01:36:34 crc kubenswrapper[4771]: I0227 01:36:34.506730 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" event={"ID":"59c79bc1-5dcb-495e-8ce8-7c74517d2df6","Type":"ContainerStarted","Data":"e889567f6718b1c62d0642dd472d4453c905fba63caca5a76daa8467c838db37"} Feb 27 01:36:34 crc kubenswrapper[4771]: I0227 01:36:34.507089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" event={"ID":"59c79bc1-5dcb-495e-8ce8-7c74517d2df6","Type":"ContainerStarted","Data":"9a27bf26ccc13962429019a3ebc9a41093eb6d2f34a6bc9d66ab39360225dcf6"} Feb 27 01:36:34 crc kubenswrapper[4771]: I0227 01:36:34.535136 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" podStartSLOduration=2.112260627 podStartE2EDuration="2.535111059s" podCreationTimestamp="2026-02-27 01:36:32 +0000 UTC" firstStartedPulling="2026-02-27 01:36:33.489894453 +0000 UTC m=+1906.427455751" lastFinishedPulling="2026-02-27 01:36:33.912744905 +0000 UTC m=+1906.850306183" observedRunningTime="2026-02-27 01:36:34.52165418 +0000 UTC m=+1907.459215508" watchObservedRunningTime="2026-02-27 01:36:34.535111059 +0000 UTC m=+1907.472672387" Feb 27 01:36:38 crc kubenswrapper[4771]: I0227 01:36:38.773432 4771 scope.go:117] "RemoveContainer" 
containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:36:38 crc kubenswrapper[4771]: E0227 01:36:38.775238 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:36:41 crc kubenswrapper[4771]: I0227 01:36:41.574703 4771 generic.go:334] "Generic (PLEG): container finished" podID="59c79bc1-5dcb-495e-8ce8-7c74517d2df6" containerID="e889567f6718b1c62d0642dd472d4453c905fba63caca5a76daa8467c838db37" exitCode=0 Feb 27 01:36:41 crc kubenswrapper[4771]: I0227 01:36:41.574760 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" event={"ID":"59c79bc1-5dcb-495e-8ce8-7c74517d2df6","Type":"ContainerDied","Data":"e889567f6718b1c62d0642dd472d4453c905fba63caca5a76daa8467c838db37"} Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.063366 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.172053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-ssh-key-openstack-edpm-ipam\") pod \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.172274 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-inventory-0\") pod \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.172351 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvwms\" (UniqueName: \"kubernetes.io/projected/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-kube-api-access-jvwms\") pod \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\" (UID: \"59c79bc1-5dcb-495e-8ce8-7c74517d2df6\") " Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.178222 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-kube-api-access-jvwms" (OuterVolumeSpecName: "kube-api-access-jvwms") pod "59c79bc1-5dcb-495e-8ce8-7c74517d2df6" (UID: "59c79bc1-5dcb-495e-8ce8-7c74517d2df6"). InnerVolumeSpecName "kube-api-access-jvwms". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.199808 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "59c79bc1-5dcb-495e-8ce8-7c74517d2df6" (UID: "59c79bc1-5dcb-495e-8ce8-7c74517d2df6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.200773 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "59c79bc1-5dcb-495e-8ce8-7c74517d2df6" (UID: "59c79bc1-5dcb-495e-8ce8-7c74517d2df6"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.274134 4771 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.274175 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvwms\" (UniqueName: \"kubernetes.io/projected/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-kube-api-access-jvwms\") on node \"crc\" DevicePath \"\"" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.274189 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/59c79bc1-5dcb-495e-8ce8-7c74517d2df6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.597130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" event={"ID":"59c79bc1-5dcb-495e-8ce8-7c74517d2df6","Type":"ContainerDied","Data":"9a27bf26ccc13962429019a3ebc9a41093eb6d2f34a6bc9d66ab39360225dcf6"} Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.597690 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a27bf26ccc13962429019a3ebc9a41093eb6d2f34a6bc9d66ab39360225dcf6" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.597213 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2v7s2" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.706125 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"] Feb 27 01:36:43 crc kubenswrapper[4771]: E0227 01:36:43.706582 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c79bc1-5dcb-495e-8ce8-7c74517d2df6" containerName="ssh-known-hosts-edpm-deployment" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.706600 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c79bc1-5dcb-495e-8ce8-7c74517d2df6" containerName="ssh-known-hosts-edpm-deployment" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.706781 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c79bc1-5dcb-495e-8ce8-7c74517d2df6" containerName="ssh-known-hosts-edpm-deployment" Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.707383 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.709747 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.709788 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.710349 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.710601 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.721153 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"]
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.885586 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.885861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.886107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdnx2\" (UniqueName: \"kubernetes.io/projected/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-kube-api-access-mdnx2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.988088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdnx2\" (UniqueName: \"kubernetes.io/projected/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-kube-api-access-mdnx2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.988201 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.988253 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.998841 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:43 crc kubenswrapper[4771]: I0227 01:36:43.999241 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:44 crc kubenswrapper[4771]: I0227 01:36:44.024164 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdnx2\" (UniqueName: \"kubernetes.io/projected/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-kube-api-access-mdnx2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pr4dd\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:44 crc kubenswrapper[4771]: I0227 01:36:44.029236 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:44 crc kubenswrapper[4771]: I0227 01:36:44.604064 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"]
Feb 27 01:36:45 crc kubenswrapper[4771]: I0227 01:36:45.617037 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd" event={"ID":"6124b0a4-176b-41d9-8ebc-db0675eeb0e4","Type":"ContainerStarted","Data":"0ef109aa60afd6fd1be99c4b4f4dd94624735a5b064b95ae50d14f585d27c319"}
Feb 27 01:36:45 crc kubenswrapper[4771]: I0227 01:36:45.617517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd" event={"ID":"6124b0a4-176b-41d9-8ebc-db0675eeb0e4","Type":"ContainerStarted","Data":"f6e1b30b27110f703511d86eb9b66d7c4b7921f9f8b78c045edd01af5aa2d74c"}
Feb 27 01:36:50 crc kubenswrapper[4771]: I0227 01:36:50.774147 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:36:50 crc kubenswrapper[4771]: E0227 01:36:50.775282 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:36:53 crc kubenswrapper[4771]: E0227 01:36:53.448662 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6124b0a4_176b_41d9_8ebc_db0675eeb0e4.slice/crio-conmon-0ef109aa60afd6fd1be99c4b4f4dd94624735a5b064b95ae50d14f585d27c319.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6124b0a4_176b_41d9_8ebc_db0675eeb0e4.slice/crio-0ef109aa60afd6fd1be99c4b4f4dd94624735a5b064b95ae50d14f585d27c319.scope\": RecentStats: unable to find data in memory cache]"
Feb 27 01:36:53 crc kubenswrapper[4771]: I0227 01:36:53.708434 4771 generic.go:334] "Generic (PLEG): container finished" podID="6124b0a4-176b-41d9-8ebc-db0675eeb0e4" containerID="0ef109aa60afd6fd1be99c4b4f4dd94624735a5b064b95ae50d14f585d27c319" exitCode=0
Feb 27 01:36:53 crc kubenswrapper[4771]: I0227 01:36:53.708926 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd" event={"ID":"6124b0a4-176b-41d9-8ebc-db0675eeb0e4","Type":"ContainerDied","Data":"0ef109aa60afd6fd1be99c4b4f4dd94624735a5b064b95ae50d14f585d27c319"}
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.189223 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.324898 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-ssh-key-openstack-edpm-ipam\") pod \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") "
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.325265 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdnx2\" (UniqueName: \"kubernetes.io/projected/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-kube-api-access-mdnx2\") pod \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") "
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.325523 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-inventory\") pod \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\" (UID: \"6124b0a4-176b-41d9-8ebc-db0675eeb0e4\") "
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.334316 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-kube-api-access-mdnx2" (OuterVolumeSpecName: "kube-api-access-mdnx2") pod "6124b0a4-176b-41d9-8ebc-db0675eeb0e4" (UID: "6124b0a4-176b-41d9-8ebc-db0675eeb0e4"). InnerVolumeSpecName "kube-api-access-mdnx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.356965 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-inventory" (OuterVolumeSpecName: "inventory") pod "6124b0a4-176b-41d9-8ebc-db0675eeb0e4" (UID: "6124b0a4-176b-41d9-8ebc-db0675eeb0e4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.362028 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6124b0a4-176b-41d9-8ebc-db0675eeb0e4" (UID: "6124b0a4-176b-41d9-8ebc-db0675eeb0e4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.428268 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.428342 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.428375 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdnx2\" (UniqueName: \"kubernetes.io/projected/6124b0a4-176b-41d9-8ebc-db0675eeb0e4-kube-api-access-mdnx2\") on node \"crc\" DevicePath \"\""
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.726622 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd" event={"ID":"6124b0a4-176b-41d9-8ebc-db0675eeb0e4","Type":"ContainerDied","Data":"f6e1b30b27110f703511d86eb9b66d7c4b7921f9f8b78c045edd01af5aa2d74c"}
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.726670 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6e1b30b27110f703511d86eb9b66d7c4b7921f9f8b78c045edd01af5aa2d74c"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.726688 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pr4dd"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.798931 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"]
Feb 27 01:36:55 crc kubenswrapper[4771]: E0227 01:36:55.799370 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6124b0a4-176b-41d9-8ebc-db0675eeb0e4" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.799392 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6124b0a4-176b-41d9-8ebc-db0675eeb0e4" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.799617 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6124b0a4-176b-41d9-8ebc-db0675eeb0e4" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.800338 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.802235 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.803097 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.803464 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.803746 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.815277 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"]
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.938992 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqnn\" (UniqueName: \"kubernetes.io/projected/df815e54-72eb-44e8-b6dd-a1758fd381e0-kube-api-access-tvqnn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.939454 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:55 crc kubenswrapper[4771]: I0227 01:36:55.939641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.041300 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.041401 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqnn\" (UniqueName: \"kubernetes.io/projected/df815e54-72eb-44e8-b6dd-a1758fd381e0-kube-api-access-tvqnn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.041469 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.045087 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.045258 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.061102 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqnn\" (UniqueName: \"kubernetes.io/projected/df815e54-72eb-44e8-b6dd-a1758fd381e0-kube-api-access-tvqnn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.149681 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.711567 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"]
Feb 27 01:36:56 crc kubenswrapper[4771]: W0227 01:36:56.714618 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf815e54_72eb_44e8_b6dd_a1758fd381e0.slice/crio-af084cf8d0d9bc17ea0550539656b9c0c64d0164c9ebf1d5d0bd4a1c05ade5df WatchSource:0}: Error finding container af084cf8d0d9bc17ea0550539656b9c0c64d0164c9ebf1d5d0bd4a1c05ade5df: Status 404 returned error can't find the container with id af084cf8d0d9bc17ea0550539656b9c0c64d0164c9ebf1d5d0bd4a1c05ade5df
Feb 27 01:36:56 crc kubenswrapper[4771]: I0227 01:36:56.744040 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k" event={"ID":"df815e54-72eb-44e8-b6dd-a1758fd381e0","Type":"ContainerStarted","Data":"af084cf8d0d9bc17ea0550539656b9c0c64d0164c9ebf1d5d0bd4a1c05ade5df"}
Feb 27 01:36:57 crc kubenswrapper[4771]: I0227 01:36:57.753090 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k" event={"ID":"df815e54-72eb-44e8-b6dd-a1758fd381e0","Type":"ContainerStarted","Data":"70907245fcaf3c1a95cedc09d9cfb293411d62af91c63d4149dbed61cf89a310"}
Feb 27 01:36:57 crc kubenswrapper[4771]: I0227 01:36:57.787126 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k" podStartSLOduration=2.3584720040000002 podStartE2EDuration="2.787106745s" podCreationTimestamp="2026-02-27 01:36:55 +0000 UTC" firstStartedPulling="2026-02-27 01:36:56.718626814 +0000 UTC m=+1929.656188102" lastFinishedPulling="2026-02-27 01:36:57.147261525 +0000 UTC m=+1930.084822843" observedRunningTime="2026-02-27 01:36:57.769405281 +0000 UTC m=+1930.706966579" watchObservedRunningTime="2026-02-27 01:36:57.787106745 +0000 UTC m=+1930.724668053"
Feb 27 01:37:02 crc kubenswrapper[4771]: I0227 01:37:02.773704 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:37:02 crc kubenswrapper[4771]: E0227 01:37:02.774836 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:37:04 crc kubenswrapper[4771]: I0227 01:37:04.047128 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzh27"]
Feb 27 01:37:04 crc kubenswrapper[4771]: I0227 01:37:04.058934 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mzh27"]
Feb 27 01:37:05 crc kubenswrapper[4771]: I0227 01:37:05.793352 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f3b6c0-daf5-40b9-bdd9-008890a2684a" path="/var/lib/kubelet/pods/f0f3b6c0-daf5-40b9-bdd9-008890a2684a/volumes"
Feb 27 01:37:06 crc kubenswrapper[4771]: I0227 01:37:06.862597 4771 generic.go:334] "Generic (PLEG): container finished" podID="df815e54-72eb-44e8-b6dd-a1758fd381e0" containerID="70907245fcaf3c1a95cedc09d9cfb293411d62af91c63d4149dbed61cf89a310" exitCode=0
Feb 27 01:37:06 crc kubenswrapper[4771]: I0227 01:37:06.862678 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k" event={"ID":"df815e54-72eb-44e8-b6dd-a1758fd381e0","Type":"ContainerDied","Data":"70907245fcaf3c1a95cedc09d9cfb293411d62af91c63d4149dbed61cf89a310"}
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.340785 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.532140 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-inventory\") pod \"df815e54-72eb-44e8-b6dd-a1758fd381e0\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") "
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.532514 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvqnn\" (UniqueName: \"kubernetes.io/projected/df815e54-72eb-44e8-b6dd-a1758fd381e0-kube-api-access-tvqnn\") pod \"df815e54-72eb-44e8-b6dd-a1758fd381e0\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") "
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.532659 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-ssh-key-openstack-edpm-ipam\") pod \"df815e54-72eb-44e8-b6dd-a1758fd381e0\" (UID: \"df815e54-72eb-44e8-b6dd-a1758fd381e0\") "
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.540126 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df815e54-72eb-44e8-b6dd-a1758fd381e0-kube-api-access-tvqnn" (OuterVolumeSpecName: "kube-api-access-tvqnn") pod "df815e54-72eb-44e8-b6dd-a1758fd381e0" (UID: "df815e54-72eb-44e8-b6dd-a1758fd381e0"). InnerVolumeSpecName "kube-api-access-tvqnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.559974 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-inventory" (OuterVolumeSpecName: "inventory") pod "df815e54-72eb-44e8-b6dd-a1758fd381e0" (UID: "df815e54-72eb-44e8-b6dd-a1758fd381e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.561777 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "df815e54-72eb-44e8-b6dd-a1758fd381e0" (UID: "df815e54-72eb-44e8-b6dd-a1758fd381e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.634629 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.634666 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvqnn\" (UniqueName: \"kubernetes.io/projected/df815e54-72eb-44e8-b6dd-a1758fd381e0-kube-api-access-tvqnn\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.634683 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df815e54-72eb-44e8-b6dd-a1758fd381e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.885103 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k" event={"ID":"df815e54-72eb-44e8-b6dd-a1758fd381e0","Type":"ContainerDied","Data":"af084cf8d0d9bc17ea0550539656b9c0c64d0164c9ebf1d5d0bd4a1c05ade5df"}
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.885140 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af084cf8d0d9bc17ea0550539656b9c0c64d0164c9ebf1d5d0bd4a1c05ade5df"
Feb 27 01:37:08 crc kubenswrapper[4771]: I0227 01:37:08.885183 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.061958 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"]
Feb 27 01:37:09 crc kubenswrapper[4771]: E0227 01:37:09.062451 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df815e54-72eb-44e8-b6dd-a1758fd381e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.062480 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="df815e54-72eb-44e8-b6dd-a1758fd381e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.062748 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="df815e54-72eb-44e8-b6dd-a1758fd381e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.063437 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.067029 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.067092 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.067306 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.067422 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.067527 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.067669 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.067695 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.067803 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.071742 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"]
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.246757 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.246830 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247072 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj28j\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-kube-api-access-sj28j\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247185 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247278 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247318 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247441 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247494 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247599 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247689 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247714 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247745 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.247773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.349362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.349465 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.349618 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj28j\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-kube-api-access-sj28j\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.349692 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.349853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.350371 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.350476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.350531 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.350655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.350773 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.350858 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.350917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.350978 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.351042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.354398 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.354905 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.354994 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.355182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.356456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.356751 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.356971 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.357517 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.357524 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.358861 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.359686 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.362518 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.362525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.369945 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj28j\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-kube-api-access-sj28j\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.397918 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:09 crc kubenswrapper[4771]: I0227 01:37:09.992733 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"]
Feb 27 01:37:10 crc kubenswrapper[4771]: I0227 01:37:10.919891 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5" event={"ID":"d903dbaa-f429-4c92-8c5a-17c1622bf8bd","Type":"ContainerStarted","Data":"56124ee8022e13dd2e203048eaf551d961f1829257ef782adff3c03907deb3d3"}
Feb 27 01:37:10 crc kubenswrapper[4771]: I0227 01:37:10.920749 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5" event={"ID":"d903dbaa-f429-4c92-8c5a-17c1622bf8bd","Type":"ContainerStarted","Data":"cedc89382aebd50411624bd6bd3bde7188920d6107534226c0444039d1bbc5ad"}
Feb 27 01:37:10 crc kubenswrapper[4771]: I0227 01:37:10.943901 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5" podStartSLOduration=1.458368594 podStartE2EDuration="1.943877338s" podCreationTimestamp="2026-02-27 01:37:09 +0000 UTC" firstStartedPulling="2026-02-27 01:37:09.998583293 +0000 UTC m=+1942.936144611" lastFinishedPulling="2026-02-27 01:37:10.484092037 +0000 UTC m=+1943.421653355" observedRunningTime="2026-02-27 01:37:10.942181732 +0000 UTC m=+1943.879743030" watchObservedRunningTime="2026-02-27 01:37:10.943877338 +0000 UTC m=+1943.881438666"
Feb 27 01:37:13 crc kubenswrapper[4771]: I0227 01:37:13.773258 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:37:13 crc kubenswrapper[4771]: E0227 01:37:13.773724 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:37:24 crc kubenswrapper[4771]: I0227 01:37:24.774777 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:37:24 crc kubenswrapper[4771]: E0227 01:37:24.775930 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:37:28 crc kubenswrapper[4771]: I0227 01:37:28.975586 4771 scope.go:117] "RemoveContainer" containerID="795b1623dbd9b500c659a38f5a9f2f23a7f05e928acadd25f8f99fccf7637d4f"
Feb 27 01:37:35 crc kubenswrapper[4771]: I0227 01:37:35.774163 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:37:35 crc kubenswrapper[4771]: E0227 01:37:35.775310 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:37:47 crc kubenswrapper[4771]: I0227 01:37:47.785545 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f"
Feb 27 01:37:47 crc kubenswrapper[4771]: E0227 01:37:47.786698 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:37:50 crc kubenswrapper[4771]: I0227 01:37:50.330694 4771 generic.go:334] "Generic (PLEG): container finished" podID="d903dbaa-f429-4c92-8c5a-17c1622bf8bd" containerID="56124ee8022e13dd2e203048eaf551d961f1829257ef782adff3c03907deb3d3" exitCode=0
Feb 27 01:37:50 crc kubenswrapper[4771]: I0227 01:37:50.331252 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5" event={"ID":"d903dbaa-f429-4c92-8c5a-17c1622bf8bd","Type":"ContainerDied","Data":"56124ee8022e13dd2e203048eaf551d961f1829257ef782adff3c03907deb3d3"}
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.846385 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5"
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.987900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-libvirt-combined-ca-bundle\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988013 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj28j\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-kube-api-access-sj28j\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988098 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-repo-setup-combined-ca-bundle\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988148 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988187 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-inventory\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988261 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ovn-combined-ca-bundle\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988305 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-bootstrap-combined-ca-bundle\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988371 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-nova-combined-ca-bundle\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988433 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-telemetry-combined-ca-bundle\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988578 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ssh-key-openstack-edpm-ipam\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988738 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988831 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.988865 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-neutron-metadata-combined-ca-bundle\") pod \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\" (UID: \"d903dbaa-f429-4c92-8c5a-17c1622bf8bd\") "
Feb 27 01:37:51 crc kubenswrapper[4771]: I0227 01:37:51.998696 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:51.999975 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.000490 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.000503 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.001009 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-kube-api-access-sj28j" (OuterVolumeSpecName: "kube-api-access-sj28j") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "kube-api-access-sj28j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.001676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.002216 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.005061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.005247 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.005762 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.006483 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.008685 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.027037 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-inventory" (OuterVolumeSpecName: "inventory") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.030309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d903dbaa-f429-4c92-8c5a-17c1622bf8bd" (UID: "d903dbaa-f429-4c92-8c5a-17c1622bf8bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.090931 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.090976 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj28j\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-kube-api-access-sj28j\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.090989 4771 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091009 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091024 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091035 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091047 4771 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091057 4771 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091068 4771 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091081 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091092 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091107 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName:
\"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091121 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.091135 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d903dbaa-f429-4c92-8c5a-17c1622bf8bd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.352999 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5" event={"ID":"d903dbaa-f429-4c92-8c5a-17c1622bf8bd","Type":"ContainerDied","Data":"cedc89382aebd50411624bd6bd3bde7188920d6107534226c0444039d1bbc5ad"} Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.353375 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cedc89382aebd50411624bd6bd3bde7188920d6107534226c0444039d1bbc5ad" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.353087 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.483836 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk"] Feb 27 01:37:52 crc kubenswrapper[4771]: E0227 01:37:52.484274 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d903dbaa-f429-4c92-8c5a-17c1622bf8bd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.484295 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d903dbaa-f429-4c92-8c5a-17c1622bf8bd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.484503 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d903dbaa-f429-4c92-8c5a-17c1622bf8bd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.485221 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.492825 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.493655 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.494105 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.495733 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.504022 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.514602 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk"] Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.607929 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.607977 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.608026 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.608091 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64b58c2b-7189-40c8-94b0-c31f167845d1-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.608161 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt5b9\" (UniqueName: \"kubernetes.io/projected/64b58c2b-7189-40c8-94b0-c31f167845d1-kube-api-access-xt5b9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.709699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64b58c2b-7189-40c8-94b0-c31f167845d1-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.709785 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5b9\" (UniqueName: \"kubernetes.io/projected/64b58c2b-7189-40c8-94b0-c31f167845d1-kube-api-access-xt5b9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.709863 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.709881 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.709909 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.710997 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64b58c2b-7189-40c8-94b0-c31f167845d1-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.713783 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.713994 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.714361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.726240 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt5b9\" (UniqueName: \"kubernetes.io/projected/64b58c2b-7189-40c8-94b0-c31f167845d1-kube-api-access-xt5b9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c9bfk\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:52 crc kubenswrapper[4771]: I0227 01:37:52.820965 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:37:53 crc kubenswrapper[4771]: I0227 01:37:53.372113 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk"] Feb 27 01:37:54 crc kubenswrapper[4771]: I0227 01:37:54.375098 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" event={"ID":"64b58c2b-7189-40c8-94b0-c31f167845d1","Type":"ContainerStarted","Data":"73d2a904d859f60d06fc5d5c5db9cfa2386244e786499ef7f028ea16053c2b93"} Feb 27 01:37:54 crc kubenswrapper[4771]: I0227 01:37:54.375707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" event={"ID":"64b58c2b-7189-40c8-94b0-c31f167845d1","Type":"ContainerStarted","Data":"35c28f33681a82ada0ce6ee7739444fb08fc7aa782bc987710fb104e335f60e9"} Feb 27 01:37:54 crc kubenswrapper[4771]: I0227 01:37:54.397928 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" podStartSLOduration=1.9332156010000001 podStartE2EDuration="2.397909666s" podCreationTimestamp="2026-02-27 01:37:52 +0000 UTC" firstStartedPulling="2026-02-27 01:37:53.3841332 +0000 UTC m=+1986.321694528" lastFinishedPulling="2026-02-27 01:37:53.848827305 +0000 UTC m=+1986.786388593" observedRunningTime="2026-02-27 01:37:54.39070735 +0000 UTC m=+1987.328268638" watchObservedRunningTime="2026-02-27 01:37:54.397909666 +0000 UTC m=+1987.335470964" Feb 27 01:37:58 crc kubenswrapper[4771]: I0227 01:37:58.774634 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:37:58 crc kubenswrapper[4771]: E0227 01:37:58.775232 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.160999 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535938-jkl79"] Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.162283 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535938-jkl79" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.164921 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.165802 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.166065 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.174512 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535938-jkl79"] Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.275821 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbln5\" (UniqueName: \"kubernetes.io/projected/c8e0385d-ba62-4fcc-a059-5114bb130263-kube-api-access-lbln5\") pod \"auto-csr-approver-29535938-jkl79\" (UID: \"c8e0385d-ba62-4fcc-a059-5114bb130263\") " pod="openshift-infra/auto-csr-approver-29535938-jkl79" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.377963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbln5\" (UniqueName: \"kubernetes.io/projected/c8e0385d-ba62-4fcc-a059-5114bb130263-kube-api-access-lbln5\") pod \"auto-csr-approver-29535938-jkl79\" (UID: \"c8e0385d-ba62-4fcc-a059-5114bb130263\") " pod="openshift-infra/auto-csr-approver-29535938-jkl79" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.409227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbln5\" (UniqueName: \"kubernetes.io/projected/c8e0385d-ba62-4fcc-a059-5114bb130263-kube-api-access-lbln5\") pod \"auto-csr-approver-29535938-jkl79\" (UID: \"c8e0385d-ba62-4fcc-a059-5114bb130263\") " pod="openshift-infra/auto-csr-approver-29535938-jkl79" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.489379 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535938-jkl79" Feb 27 01:38:00 crc kubenswrapper[4771]: I0227 01:38:00.990942 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535938-jkl79"] Feb 27 01:38:01 crc kubenswrapper[4771]: I0227 01:38:01.456741 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535938-jkl79" event={"ID":"c8e0385d-ba62-4fcc-a059-5114bb130263","Type":"ContainerStarted","Data":"0dacd9c6982514363e9b4c2220265954809370d4c63a25b6a2cc3e5df408a9bb"} Feb 27 01:38:02 crc kubenswrapper[4771]: I0227 01:38:02.471394 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535938-jkl79" event={"ID":"c8e0385d-ba62-4fcc-a059-5114bb130263","Type":"ContainerStarted","Data":"636e297e3d86ea96342139341e1b675169250e885d08f049b5c0e5a8a772580d"} Feb 27 01:38:02 crc kubenswrapper[4771]: I0227 01:38:02.488964 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535938-jkl79" podStartSLOduration=1.48702048 podStartE2EDuration="2.488948323s" podCreationTimestamp="2026-02-27 01:38:00 +0000 UTC" firstStartedPulling="2026-02-27 01:38:00.999878542 +0000 UTC m=+1993.937439830" lastFinishedPulling="2026-02-27 01:38:02.001806345 +0000 UTC m=+1994.939367673" observedRunningTime="2026-02-27 01:38:02.487116673 +0000 UTC m=+1995.424677981" watchObservedRunningTime="2026-02-27 01:38:02.488948323 +0000 UTC m=+1995.426509611" Feb 27 01:38:03 crc kubenswrapper[4771]: I0227 01:38:03.484833 4771 generic.go:334] "Generic (PLEG): container finished" podID="c8e0385d-ba62-4fcc-a059-5114bb130263" containerID="636e297e3d86ea96342139341e1b675169250e885d08f049b5c0e5a8a772580d" exitCode=0 Feb 27 01:38:03 crc kubenswrapper[4771]: I0227 01:38:03.484911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535938-jkl79" event={"ID":"c8e0385d-ba62-4fcc-a059-5114bb130263","Type":"ContainerDied","Data":"636e297e3d86ea96342139341e1b675169250e885d08f049b5c0e5a8a772580d"} Feb 27 01:38:04 crc kubenswrapper[4771]: I0227 01:38:04.966702 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535938-jkl79" Feb 27 01:38:04 crc kubenswrapper[4771]: I0227 01:38:04.985923 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbln5\" (UniqueName: \"kubernetes.io/projected/c8e0385d-ba62-4fcc-a059-5114bb130263-kube-api-access-lbln5\") pod \"c8e0385d-ba62-4fcc-a059-5114bb130263\" (UID: \"c8e0385d-ba62-4fcc-a059-5114bb130263\") " Feb 27 01:38:04 crc kubenswrapper[4771]: I0227 01:38:04.994707 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e0385d-ba62-4fcc-a059-5114bb130263-kube-api-access-lbln5" (OuterVolumeSpecName: "kube-api-access-lbln5") pod "c8e0385d-ba62-4fcc-a059-5114bb130263" (UID: "c8e0385d-ba62-4fcc-a059-5114bb130263"). InnerVolumeSpecName "kube-api-access-lbln5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:38:05 crc kubenswrapper[4771]: I0227 01:38:05.089165 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbln5\" (UniqueName: \"kubernetes.io/projected/c8e0385d-ba62-4fcc-a059-5114bb130263-kube-api-access-lbln5\") on node \"crc\" DevicePath \"\"" Feb 27 01:38:05 crc kubenswrapper[4771]: I0227 01:38:05.504881 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535938-jkl79" event={"ID":"c8e0385d-ba62-4fcc-a059-5114bb130263","Type":"ContainerDied","Data":"0dacd9c6982514363e9b4c2220265954809370d4c63a25b6a2cc3e5df408a9bb"} Feb 27 01:38:05 crc kubenswrapper[4771]: I0227 01:38:05.504930 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dacd9c6982514363e9b4c2220265954809370d4c63a25b6a2cc3e5df408a9bb" Feb 27 01:38:05 crc kubenswrapper[4771]: I0227 01:38:05.504947 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535938-jkl79" Feb 27 01:38:05 crc kubenswrapper[4771]: I0227 01:38:05.563314 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535932-4rx7t"] Feb 27 01:38:05 crc kubenswrapper[4771]: I0227 01:38:05.574328 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535932-4rx7t"] Feb 27 01:38:05 crc kubenswrapper[4771]: I0227 01:38:05.791697 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8584c4-bf5c-47f7-83af-af6162407eba" path="/var/lib/kubelet/pods/ef8584c4-bf5c-47f7-83af-af6162407eba/volumes" Feb 27 01:38:09 crc kubenswrapper[4771]: I0227 01:38:09.773574 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:38:09 crc kubenswrapper[4771]: E0227 01:38:09.774256 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:38:21 crc kubenswrapper[4771]: I0227 01:38:21.774257 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:38:21 crc kubenswrapper[4771]: E0227 01:38:21.775361 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:38:29 crc kubenswrapper[4771]: I0227 01:38:29.060205 4771 scope.go:117] "RemoveContainer" containerID="113f2d9db81d779181c411a5cffdb485c1a9dba56549f29c916f6125a9d62fc5" Feb 27 01:38:34 crc kubenswrapper[4771]: I0227 01:38:34.773036 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:38:34 crc kubenswrapper[4771]: E0227 01:38:34.773937 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:38:45 crc kubenswrapper[4771]: I0227 01:38:45.774042 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:38:45 crc kubenswrapper[4771]: E0227 01:38:45.775095 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:38:56 crc kubenswrapper[4771]: I0227 01:38:56.774801 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:38:56 crc kubenswrapper[4771]: E0227 01:38:56.776016 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:38:58 crc kubenswrapper[4771]: I0227 01:38:58.053872 4771 generic.go:334] "Generic (PLEG): container finished" podID="64b58c2b-7189-40c8-94b0-c31f167845d1" containerID="73d2a904d859f60d06fc5d5c5db9cfa2386244e786499ef7f028ea16053c2b93" exitCode=0 Feb 27 01:38:58 crc kubenswrapper[4771]: I0227 01:38:58.053938 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" event={"ID":"64b58c2b-7189-40c8-94b0-c31f167845d1","Type":"ContainerDied","Data":"73d2a904d859f60d06fc5d5c5db9cfa2386244e786499ef7f028ea16053c2b93"} Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.522223 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.587510 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt5b9\" (UniqueName: \"kubernetes.io/projected/64b58c2b-7189-40c8-94b0-c31f167845d1-kube-api-access-xt5b9\") pod \"64b58c2b-7189-40c8-94b0-c31f167845d1\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.587612 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ssh-key-openstack-edpm-ipam\") pod \"64b58c2b-7189-40c8-94b0-c31f167845d1\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.588595 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ovn-combined-ca-bundle\") pod \"64b58c2b-7189-40c8-94b0-c31f167845d1\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.588855 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-inventory\") pod \"64b58c2b-7189-40c8-94b0-c31f167845d1\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.588872 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64b58c2b-7189-40c8-94b0-c31f167845d1-ovncontroller-config-0\") pod \"64b58c2b-7189-40c8-94b0-c31f167845d1\" (UID: \"64b58c2b-7189-40c8-94b0-c31f167845d1\") " Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.595410 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b58c2b-7189-40c8-94b0-c31f167845d1-kube-api-access-xt5b9" (OuterVolumeSpecName: "kube-api-access-xt5b9") pod "64b58c2b-7189-40c8-94b0-c31f167845d1" (UID: "64b58c2b-7189-40c8-94b0-c31f167845d1"). InnerVolumeSpecName "kube-api-access-xt5b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.607077 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "64b58c2b-7189-40c8-94b0-c31f167845d1" (UID: "64b58c2b-7189-40c8-94b0-c31f167845d1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.621024 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "64b58c2b-7189-40c8-94b0-c31f167845d1" (UID: "64b58c2b-7189-40c8-94b0-c31f167845d1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.638290 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64b58c2b-7189-40c8-94b0-c31f167845d1-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "64b58c2b-7189-40c8-94b0-c31f167845d1" (UID: "64b58c2b-7189-40c8-94b0-c31f167845d1"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.644985 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-inventory" (OuterVolumeSpecName: "inventory") pod "64b58c2b-7189-40c8-94b0-c31f167845d1" (UID: "64b58c2b-7189-40c8-94b0-c31f167845d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.690647 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.690684 4771 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/64b58c2b-7189-40c8-94b0-c31f167845d1-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.690697 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.690710 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt5b9\" (UniqueName: \"kubernetes.io/projected/64b58c2b-7189-40c8-94b0-c31f167845d1-kube-api-access-xt5b9\") on node \"crc\" DevicePath \"\"" Feb 27 01:38:59 crc kubenswrapper[4771]: I0227 01:38:59.690724 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64b58c2b-7189-40c8-94b0-c31f167845d1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.091231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" event={"ID":"64b58c2b-7189-40c8-94b0-c31f167845d1","Type":"ContainerDied","Data":"35c28f33681a82ada0ce6ee7739444fb08fc7aa782bc987710fb104e335f60e9"} Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.091318 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35c28f33681a82ada0ce6ee7739444fb08fc7aa782bc987710fb104e335f60e9" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.109379 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c9bfk" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.195359 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl"] Feb 27 01:39:00 crc kubenswrapper[4771]: E0227 01:39:00.195880 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e0385d-ba62-4fcc-a059-5114bb130263" containerName="oc" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.195900 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e0385d-ba62-4fcc-a059-5114bb130263" containerName="oc" Feb 27 01:39:00 crc kubenswrapper[4771]: E0227 01:39:00.195929 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b58c2b-7189-40c8-94b0-c31f167845d1" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.195937 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b58c2b-7189-40c8-94b0-c31f167845d1" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.196143 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e0385d-ba62-4fcc-a059-5114bb130263" containerName="oc" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.196160 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b58c2b-7189-40c8-94b0-c31f167845d1" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.196901 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.200071 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.200223 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.200425 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.201022 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.201262 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.203250 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.207710 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl"] Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.307086 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc 
kubenswrapper[4771]: I0227 01:39:00.307271 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.307315 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.307434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.307463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg4qg\" (UniqueName: \"kubernetes.io/projected/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-kube-api-access-gg4qg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.307713 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.409934 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.410016 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.410043 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.410099 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg4qg\" (UniqueName: \"kubernetes.io/projected/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-kube-api-access-gg4qg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.410127 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.410186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.424621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.424623 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.424622 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.425386 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.428221 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.428764 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg4qg\" (UniqueName: \"kubernetes.io/projected/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-kube-api-access-gg4qg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:00 crc kubenswrapper[4771]: I0227 01:39:00.530495 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:01 crc kubenswrapper[4771]: I0227 01:39:01.068908 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl"] Feb 27 01:39:01 crc kubenswrapper[4771]: I0227 01:39:01.104541 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" event={"ID":"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8","Type":"ContainerStarted","Data":"5842f7578429add30cdea66dcb349dee17b3a612a5c994b0ad6410b536cc1693"} Feb 27 01:39:02 crc kubenswrapper[4771]: I0227 01:39:02.116289 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" event={"ID":"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8","Type":"ContainerStarted","Data":"bdf7b94c322e4349874d3297ff8bcbd8379411e1090d423a7b7c630a330184b5"} Feb 27 01:39:02 crc kubenswrapper[4771]: I0227 01:39:02.159505 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" podStartSLOduration=1.713589236 podStartE2EDuration="2.159479889s" podCreationTimestamp="2026-02-27 01:39:00 +0000 UTC" firstStartedPulling="2026-02-27 01:39:01.067925212 +0000 UTC m=+2054.005486500" lastFinishedPulling="2026-02-27 01:39:01.513815865 +0000 UTC m=+2054.451377153" observedRunningTime="2026-02-27 01:39:02.147484151 +0000 UTC m=+2055.085045459" watchObservedRunningTime="2026-02-27 01:39:02.159479889 +0000 UTC m=+2055.097041207" Feb 27 01:39:11 crc kubenswrapper[4771]: I0227 01:39:11.773676 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:39:11 crc kubenswrapper[4771]: E0227 01:39:11.775240 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:39:22 crc kubenswrapper[4771]: I0227 01:39:22.774490 4771 scope.go:117] "RemoveContainer" 
containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:39:22 crc kubenswrapper[4771]: E0227 01:39:22.775600 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:39:34 crc kubenswrapper[4771]: I0227 01:39:34.773368 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:39:35 crc kubenswrapper[4771]: I0227 01:39:35.453110 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"2fa58e5c69c6875961ade4a259d074511e93c8575b54e900bfb9f5dcc26be68a"} Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.746822 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5cls7"] Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.760365 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.774643 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cls7"] Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.862643 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpjgp\" (UniqueName: \"kubernetes.io/projected/99d5f314-0101-4405-820e-19f22ee99a4c-kube-api-access-xpjgp\") pod \"community-operators-5cls7\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.862735 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-utilities\") pod \"community-operators-5cls7\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.862906 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-catalog-content\") pod \"community-operators-5cls7\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.964543 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-utilities\") pod \"community-operators-5cls7\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.964886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-catalog-content\") pod \"community-operators-5cls7\" 
(UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.964956 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpjgp\" (UniqueName: \"kubernetes.io/projected/99d5f314-0101-4405-820e-19f22ee99a4c-kube-api-access-xpjgp\") pod \"community-operators-5cls7\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.966338 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-utilities\") pod \"community-operators-5cls7\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.966878 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-catalog-content\") pod \"community-operators-5cls7\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:50 crc kubenswrapper[4771]: I0227 01:39:50.986691 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpjgp\" (UniqueName: \"kubernetes.io/projected/99d5f314-0101-4405-820e-19f22ee99a4c-kube-api-access-xpjgp\") pod \"community-operators-5cls7\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:51 crc kubenswrapper[4771]: I0227 01:39:51.083610 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:39:51 crc kubenswrapper[4771]: I0227 01:39:51.652507 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cls7"] Feb 27 01:39:52 crc kubenswrapper[4771]: I0227 01:39:52.630754 4771 generic.go:334] "Generic (PLEG): container finished" podID="9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" containerID="bdf7b94c322e4349874d3297ff8bcbd8379411e1090d423a7b7c630a330184b5" exitCode=0 Feb 27 01:39:52 crc kubenswrapper[4771]: I0227 01:39:52.630861 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" event={"ID":"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8","Type":"ContainerDied","Data":"bdf7b94c322e4349874d3297ff8bcbd8379411e1090d423a7b7c630a330184b5"} Feb 27 01:39:52 crc kubenswrapper[4771]: I0227 01:39:52.634392 4771 generic.go:334] "Generic (PLEG): container finished" podID="99d5f314-0101-4405-820e-19f22ee99a4c" containerID="18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb" exitCode=0 Feb 27 01:39:52 crc kubenswrapper[4771]: I0227 01:39:52.634436 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cls7" event={"ID":"99d5f314-0101-4405-820e-19f22ee99a4c","Type":"ContainerDied","Data":"18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb"} Feb 27 01:39:52 crc kubenswrapper[4771]: I0227 01:39:52.634465 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cls7" event={"ID":"99d5f314-0101-4405-820e-19f22ee99a4c","Type":"ContainerStarted","Data":"561f99f816a528017ca22eceeb61795de56cd16b3e35f0e95aac9ef5ba301374"} Feb 27 01:39:52 crc kubenswrapper[4771]: I0227 01:39:52.638968 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.192293 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.353774 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg4qg\" (UniqueName: \"kubernetes.io/projected/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-kube-api-access-gg4qg\") pod \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.353836 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.353866 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-nova-metadata-neutron-config-0\") pod \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.353908 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-inventory\") pod \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.354933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-metadata-combined-ca-bundle\") pod \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.354965 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-ssh-key-openstack-edpm-ipam\") pod \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\" (UID: \"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8\") " Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.362546 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-kube-api-access-gg4qg" (OuterVolumeSpecName: "kube-api-access-gg4qg") pod "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" (UID: "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8"). InnerVolumeSpecName "kube-api-access-gg4qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.368707 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" (UID: "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.402130 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" (UID: "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.402894 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" (UID: "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.422448 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-inventory" (OuterVolumeSpecName: "inventory") pod "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" (UID: "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.424370 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" (UID: "9a2ce866-27c5-4ac5-8a27-d44ba505c3d8"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.457584 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg4qg\" (UniqueName: \"kubernetes.io/projected/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-kube-api-access-gg4qg\") on node \"crc\" DevicePath \"\"" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.457635 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.457652 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.457668 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.457681 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.457696 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a2ce866-27c5-4ac5-8a27-d44ba505c3d8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.658639 4771 generic.go:334] "Generic (PLEG): container finished" podID="99d5f314-0101-4405-820e-19f22ee99a4c" containerID="cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668" exitCode=0 Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.658795 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cls7" event={"ID":"99d5f314-0101-4405-820e-19f22ee99a4c","Type":"ContainerDied","Data":"cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668"} Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.662165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" event={"ID":"9a2ce866-27c5-4ac5-8a27-d44ba505c3d8","Type":"ContainerDied","Data":"5842f7578429add30cdea66dcb349dee17b3a612a5c994b0ad6410b536cc1693"} Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.662227 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5842f7578429add30cdea66dcb349dee17b3a612a5c994b0ad6410b536cc1693" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.662311 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.780747 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz"] Feb 27 01:39:54 crc kubenswrapper[4771]: E0227 01:39:54.781397 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.781429 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.781724 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a2ce866-27c5-4ac5-8a27-d44ba505c3d8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.782757 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.785261 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.785474 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.785702 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.785766 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.785916 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.797637 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz"] Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.968699 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.968912 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.971049 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: 
\"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.971371 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29sfw\" (UniqueName: \"kubernetes.io/projected/40c7ae0e-123b-42cf-99cf-57309d7c22b0-kube-api-access-29sfw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:54 crc kubenswrapper[4771]: I0227 01:39:54.971972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.076033 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.077422 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.077767 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.077862 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29sfw\" (UniqueName: \"kubernetes.io/projected/40c7ae0e-123b-42cf-99cf-57309d7c22b0-kube-api-access-29sfw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.077918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.084072 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" 
(UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.085692 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.087144 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.087385 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.107800 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29sfw\" (UniqueName: \"kubernetes.io/projected/40c7ae0e-123b-42cf-99cf-57309d7c22b0-kube-api-access-29sfw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.408586 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.673265 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cls7" event={"ID":"99d5f314-0101-4405-820e-19f22ee99a4c","Type":"ContainerStarted","Data":"983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63"} Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.695824 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5cls7" podStartSLOduration=3.233331283 podStartE2EDuration="5.695804345s" podCreationTimestamp="2026-02-27 01:39:50 +0000 UTC" firstStartedPulling="2026-02-27 01:39:52.636762861 +0000 UTC m=+2105.574324159" lastFinishedPulling="2026-02-27 01:39:55.099235893 +0000 UTC m=+2108.036797221" observedRunningTime="2026-02-27 01:39:55.687413655 +0000 UTC m=+2108.624974943" watchObservedRunningTime="2026-02-27 01:39:55.695804345 +0000 UTC m=+2108.633365633" Feb 27 01:39:55 crc kubenswrapper[4771]: I0227 01:39:55.950021 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz"] Feb 27 01:39:55 crc kubenswrapper[4771]: W0227 01:39:55.953992 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40c7ae0e_123b_42cf_99cf_57309d7c22b0.slice/crio-c71b92ac50bc7ab5890b45a77763c22a8eb33d27d98d1283dfc0393a381cb846 WatchSource:0}: Error finding container c71b92ac50bc7ab5890b45a77763c22a8eb33d27d98d1283dfc0393a381cb846: Status 404 returned error can't find the container with id c71b92ac50bc7ab5890b45a77763c22a8eb33d27d98d1283dfc0393a381cb846 Feb 27 01:39:56 crc kubenswrapper[4771]: I0227 01:39:56.685248 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" event={"ID":"40c7ae0e-123b-42cf-99cf-57309d7c22b0","Type":"ContainerStarted","Data":"fa68ff7b11ac3d5081d22fc1a7e2eee79bb32e8839e91c6eb57c62684a2e362d"} Feb 27 01:39:56 crc kubenswrapper[4771]: I0227 01:39:56.685645 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" event={"ID":"40c7ae0e-123b-42cf-99cf-57309d7c22b0","Type":"ContainerStarted","Data":"c71b92ac50bc7ab5890b45a77763c22a8eb33d27d98d1283dfc0393a381cb846"} Feb 27 01:39:56 crc kubenswrapper[4771]: I0227 01:39:56.707592 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" podStartSLOduration=2.253360839 podStartE2EDuration="2.70757472s" podCreationTimestamp="2026-02-27 01:39:54 +0000 UTC" firstStartedPulling="2026-02-27 01:39:55.95641547 +0000 UTC m=+2108.893976758" lastFinishedPulling="2026-02-27 01:39:56.410629341 +0000 UTC m=+2109.348190639" observedRunningTime="2026-02-27 01:39:56.701614996 +0000 UTC m=+2109.639176294" watchObservedRunningTime="2026-02-27 01:39:56.70757472 +0000 UTC m=+2109.645136008" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.129767 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535940-t6x6t"] Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.132959 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535940-t6x6t" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.135465 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.135482 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.135479 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.139208 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535940-t6x6t"] Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.286178 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n8tp\" (UniqueName: \"kubernetes.io/projected/2bb33d2c-2a90-4f39-b5a4-1f18141ad41d-kube-api-access-6n8tp\") pod \"auto-csr-approver-29535940-t6x6t\" (UID: \"2bb33d2c-2a90-4f39-b5a4-1f18141ad41d\") " pod="openshift-infra/auto-csr-approver-29535940-t6x6t" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.387915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n8tp\" (UniqueName: \"kubernetes.io/projected/2bb33d2c-2a90-4f39-b5a4-1f18141ad41d-kube-api-access-6n8tp\") pod \"auto-csr-approver-29535940-t6x6t\" (UID: \"2bb33d2c-2a90-4f39-b5a4-1f18141ad41d\") " pod="openshift-infra/auto-csr-approver-29535940-t6x6t" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.414424 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n8tp\" (UniqueName: \"kubernetes.io/projected/2bb33d2c-2a90-4f39-b5a4-1f18141ad41d-kube-api-access-6n8tp\") pod \"auto-csr-approver-29535940-t6x6t\" (UID: \"2bb33d2c-2a90-4f39-b5a4-1f18141ad41d\") " pod="openshift-infra/auto-csr-approver-29535940-t6x6t" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.456242 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535940-t6x6t" Feb 27 01:40:00 crc kubenswrapper[4771]: I0227 01:40:00.934724 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535940-t6x6t"] Feb 27 01:40:00 crc kubenswrapper[4771]: W0227 01:40:00.945190 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb33d2c_2a90_4f39_b5a4_1f18141ad41d.slice/crio-7e7b8ac5e7598d5fe3498dcc19b3bffa626955167982acece76e598106d3ad88 WatchSource:0}: Error finding container 7e7b8ac5e7598d5fe3498dcc19b3bffa626955167982acece76e598106d3ad88: Status 404 returned error can't find the container with id 7e7b8ac5e7598d5fe3498dcc19b3bffa626955167982acece76e598106d3ad88 Feb 27 01:40:01 crc kubenswrapper[4771]: I0227 01:40:01.085045 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:40:01 crc kubenswrapper[4771]: I0227 01:40:01.085094 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:40:01 crc kubenswrapper[4771]: I0227 01:40:01.133143 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:40:01 crc kubenswrapper[4771]: I0227 01:40:01.736779 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535940-t6x6t" event={"ID":"2bb33d2c-2a90-4f39-b5a4-1f18141ad41d","Type":"ContainerStarted","Data":"7e7b8ac5e7598d5fe3498dcc19b3bffa626955167982acece76e598106d3ad88"} Feb 27 01:40:01 crc kubenswrapper[4771]: I0227 01:40:01.804276 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:40:01 crc kubenswrapper[4771]: I0227 01:40:01.857659 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cls7"] Feb 27 01:40:02 crc kubenswrapper[4771]: I0227 01:40:02.750978 4771 generic.go:334] "Generic (PLEG): container finished" podID="2bb33d2c-2a90-4f39-b5a4-1f18141ad41d" containerID="825dec4b2239079b87f038b9becca27e61ea99a634550930605fb7928de59bea" exitCode=0 Feb 27 01:40:02 crc kubenswrapper[4771]: I0227 01:40:02.751075 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535940-t6x6t" event={"ID":"2bb33d2c-2a90-4f39-b5a4-1f18141ad41d","Type":"ContainerDied","Data":"825dec4b2239079b87f038b9becca27e61ea99a634550930605fb7928de59bea"} Feb 27 01:40:03 crc kubenswrapper[4771]: I0227 01:40:03.762419 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5cls7" podUID="99d5f314-0101-4405-820e-19f22ee99a4c" containerName="registry-server" containerID="cri-o://983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63" gracePeriod=2 Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.103284 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535940-t6x6t" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.251702 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.268120 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n8tp\" (UniqueName: \"kubernetes.io/projected/2bb33d2c-2a90-4f39-b5a4-1f18141ad41d-kube-api-access-6n8tp\") pod \"2bb33d2c-2a90-4f39-b5a4-1f18141ad41d\" (UID: \"2bb33d2c-2a90-4f39-b5a4-1f18141ad41d\") " Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.274396 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb33d2c-2a90-4f39-b5a4-1f18141ad41d-kube-api-access-6n8tp" (OuterVolumeSpecName: "kube-api-access-6n8tp") pod "2bb33d2c-2a90-4f39-b5a4-1f18141ad41d" (UID: "2bb33d2c-2a90-4f39-b5a4-1f18141ad41d"). InnerVolumeSpecName "kube-api-access-6n8tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.370269 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpjgp\" (UniqueName: \"kubernetes.io/projected/99d5f314-0101-4405-820e-19f22ee99a4c-kube-api-access-xpjgp\") pod \"99d5f314-0101-4405-820e-19f22ee99a4c\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.370640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-catalog-content\") pod \"99d5f314-0101-4405-820e-19f22ee99a4c\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.370807 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-utilities\") pod \"99d5f314-0101-4405-820e-19f22ee99a4c\" (UID: \"99d5f314-0101-4405-820e-19f22ee99a4c\") " Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.371424 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n8tp\" (UniqueName: \"kubernetes.io/projected/2bb33d2c-2a90-4f39-b5a4-1f18141ad41d-kube-api-access-6n8tp\") on node \"crc\" DevicePath \"\"" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.371930 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-utilities" (OuterVolumeSpecName: "utilities") pod "99d5f314-0101-4405-820e-19f22ee99a4c" (UID: "99d5f314-0101-4405-820e-19f22ee99a4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.374331 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d5f314-0101-4405-820e-19f22ee99a4c-kube-api-access-xpjgp" (OuterVolumeSpecName: "kube-api-access-xpjgp") pod "99d5f314-0101-4405-820e-19f22ee99a4c" (UID: "99d5f314-0101-4405-820e-19f22ee99a4c"). InnerVolumeSpecName "kube-api-access-xpjgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.427418 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99d5f314-0101-4405-820e-19f22ee99a4c" (UID: "99d5f314-0101-4405-820e-19f22ee99a4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.473539 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.473649 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d5f314-0101-4405-820e-19f22ee99a4c-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.473666 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpjgp\" (UniqueName: \"kubernetes.io/projected/99d5f314-0101-4405-820e-19f22ee99a4c-kube-api-access-xpjgp\") on node \"crc\" DevicePath \"\"" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.782814 4771 generic.go:334] "Generic (PLEG): container finished" podID="99d5f314-0101-4405-820e-19f22ee99a4c" containerID="983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63" exitCode=0 Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.782871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cls7" event={"ID":"99d5f314-0101-4405-820e-19f22ee99a4c","Type":"ContainerDied","Data":"983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63"} Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.782931 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cls7" event={"ID":"99d5f314-0101-4405-820e-19f22ee99a4c","Type":"ContainerDied","Data":"561f99f816a528017ca22eceeb61795de56cd16b3e35f0e95aac9ef5ba301374"} Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.782941 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cls7" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.782953 4771 scope.go:117] "RemoveContainer" containerID="983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.785416 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535940-t6x6t" event={"ID":"2bb33d2c-2a90-4f39-b5a4-1f18141ad41d","Type":"ContainerDied","Data":"7e7b8ac5e7598d5fe3498dcc19b3bffa626955167982acece76e598106d3ad88"} Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.785442 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7b8ac5e7598d5fe3498dcc19b3bffa626955167982acece76e598106d3ad88" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.785508 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535940-t6x6t" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.816706 4771 scope.go:117] "RemoveContainer" containerID="cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.848900 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cls7"] Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.863307 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5cls7"] Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.873997 4771 scope.go:117] "RemoveContainer" containerID="18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.945935 4771 scope.go:117] "RemoveContainer" containerID="983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63" Feb 27 01:40:04 crc kubenswrapper[4771]: E0227 01:40:04.946811 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63\": container with ID starting with 983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63 not found: ID does not exist" containerID="983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.946866 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63"} err="failed to get container status \"983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63\": rpc error: code = NotFound desc = could not find container \"983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63\": container with ID starting with 983284cb4c6c5916546d24a7d309180b8d53e203ad62b9429274959fa86d9c63 not found: ID does not exist" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.946899 4771 scope.go:117] "RemoveContainer" containerID="cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668" Feb 27 01:40:04 crc kubenswrapper[4771]: E0227 01:40:04.947333 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668\": container with ID starting with cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668 not found: ID does not exist" containerID="cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.947380 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668"} err="failed to get container status \"cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668\": rpc error: code = NotFound desc = could not find container \"cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668\": container with ID starting with cc4c4d436bfc2d82fe12f1161d05a21312e10dc5bec20c918c2fd0d8e3c2f668 not found: ID does not exist" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.947407 4771 scope.go:117] "RemoveContainer" containerID="18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb" Feb 27 01:40:04 crc kubenswrapper[4771]: E0227 01:40:04.948724 4771 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb\": container with ID starting with 18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb not found: ID does not exist" containerID="18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb" Feb 27 01:40:04 crc kubenswrapper[4771]: I0227 01:40:04.948879 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb"} err="failed to get container status \"18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb\": rpc error: code = NotFound desc = could not find container \"18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb\": container with ID starting with 18f067ad41e05e5ce419f8c2a3a095a8c0a7fb9539f376e9bd2c8da2023c83fb not found: ID does not exist" Feb 27 01:40:05 crc kubenswrapper[4771]: I0227 01:40:05.181787 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535934-lh2pf"] Feb 27 01:40:05 crc kubenswrapper[4771]: I0227 01:40:05.189992 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535934-lh2pf"] Feb 27 01:40:05 crc kubenswrapper[4771]: I0227 01:40:05.785657 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4c1bda-e210-4613-9bd2-34b82bc45640" path="/var/lib/kubelet/pods/2e4c1bda-e210-4613-9bd2-34b82bc45640/volumes" Feb 27 01:40:05 crc kubenswrapper[4771]: I0227 01:40:05.786541 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d5f314-0101-4405-820e-19f22ee99a4c" path="/var/lib/kubelet/pods/99d5f314-0101-4405-820e-19f22ee99a4c/volumes" Feb 27 01:40:29 crc kubenswrapper[4771]: I0227 01:40:29.165597 4771 scope.go:117] "RemoveContainer" containerID="2da1f5f128734333c8e6b022f59c10e537034ce9ee5b4bb44a76bae45cd9bbf9" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.816595 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m5sqt"] Feb 27 01:41:13 crc kubenswrapper[4771]: E0227 01:41:13.817727 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb33d2c-2a90-4f39-b5a4-1f18141ad41d" containerName="oc" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.817745 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb33d2c-2a90-4f39-b5a4-1f18141ad41d" containerName="oc" Feb 27 01:41:13 crc kubenswrapper[4771]: E0227 01:41:13.817770 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d5f314-0101-4405-820e-19f22ee99a4c" containerName="extract-content" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.817779 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d5f314-0101-4405-820e-19f22ee99a4c" containerName="extract-content" Feb 27 01:41:13 crc kubenswrapper[4771]: E0227 01:41:13.817810 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d5f314-0101-4405-820e-19f22ee99a4c" containerName="registry-server" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.817819 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d5f314-0101-4405-820e-19f22ee99a4c" containerName="registry-server" Feb 27 01:41:13 crc kubenswrapper[4771]: E0227 01:41:13.817840 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d5f314-0101-4405-820e-19f22ee99a4c" containerName="extract-utilities" Feb 27 01:41:13 crc kubenswrapper[4771]: 
Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.818071 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d5f314-0101-4405-820e-19f22ee99a4c" containerName="registry-server" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.818094 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb33d2c-2a90-4f39-b5a4-1f18141ad41d" containerName="oc" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.819903 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.828941 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5sqt"] Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.942408 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxqm\" (UniqueName: \"kubernetes.io/projected/98a4919b-d031-42ef-ae21-df6e724be151-kube-api-access-7sxqm\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.945510 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-catalog-content\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:13 crc kubenswrapper[4771]: I0227 01:41:13.946251 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-utilities\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:14 crc kubenswrapper[4771]: I0227 01:41:14.048851 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-utilities\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:14 crc kubenswrapper[4771]: I0227 01:41:14.049435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxqm\" (UniqueName: \"kubernetes.io/projected/98a4919b-d031-42ef-ae21-df6e724be151-kube-api-access-7sxqm\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:14 crc kubenswrapper[4771]: I0227 01:41:14.049508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-catalog-content\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:14 crc kubenswrapper[4771]: I0227 01:41:14.049988 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-utilities\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:14 crc kubenswrapper[4771]: I0227 01:41:14.050121 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-catalog-content\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:14 crc kubenswrapper[4771]: I0227 01:41:14.072674 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxqm\" (UniqueName: \"kubernetes.io/projected/98a4919b-d031-42ef-ae21-df6e724be151-kube-api-access-7sxqm\") pod \"certified-operators-m5sqt\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:14 crc kubenswrapper[4771]: I0227 01:41:14.159093 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:14 crc kubenswrapper[4771]: I0227 01:41:14.647814 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5sqt"] Feb 27 01:41:15 crc kubenswrapper[4771]: I0227 01:41:15.460039 4771 generic.go:334] "Generic (PLEG): container finished" podID="98a4919b-d031-42ef-ae21-df6e724be151" containerID="90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f" exitCode=0 Feb 27 01:41:15 crc kubenswrapper[4771]: I0227 01:41:15.460483 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5sqt" event={"ID":"98a4919b-d031-42ef-ae21-df6e724be151","Type":"ContainerDied","Data":"90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f"} Feb 27 01:41:15 crc kubenswrapper[4771]: I0227 01:41:15.460537 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5sqt" event={"ID":"98a4919b-d031-42ef-ae21-df6e724be151","Type":"ContainerStarted","Data":"cb47b444188a9dc2d63c03d98640c50ae877cf56866eb3a88070a4d3f666df3f"} Feb 27 01:41:16 crc kubenswrapper[4771]: I0227 01:41:16.470101 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5sqt" event={"ID":"98a4919b-d031-42ef-ae21-df6e724be151","Type":"ContainerStarted","Data":"b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6"} Feb 27 01:41:17 crc kubenswrapper[4771]: I0227 01:41:17.497365 4771 generic.go:334] "Generic (PLEG): container finished" podID="98a4919b-d031-42ef-ae21-df6e724be151" containerID="b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6" exitCode=0 Feb 27 01:41:17 crc kubenswrapper[4771]: I0227 01:41:17.497508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5sqt" event={"ID":"98a4919b-d031-42ef-ae21-df6e724be151","Type":"ContainerDied","Data":"b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6"} Feb 27 01:41:18 crc kubenswrapper[4771]: I0227 01:41:18.510820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5sqt" event={"ID":"98a4919b-d031-42ef-ae21-df6e724be151","Type":"ContainerStarted","Data":"9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238"} Feb 27 01:41:18 crc kubenswrapper[4771]: I0227 
01:41:18.539432 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m5sqt" podStartSLOduration=2.967476606 podStartE2EDuration="5.539412521s" podCreationTimestamp="2026-02-27 01:41:13 +0000 UTC" firstStartedPulling="2026-02-27 01:41:15.463252859 +0000 UTC m=+2188.400814177" lastFinishedPulling="2026-02-27 01:41:18.035188784 +0000 UTC m=+2190.972750092" observedRunningTime="2026-02-27 01:41:18.531043792 +0000 UTC m=+2191.468605080" watchObservedRunningTime="2026-02-27 01:41:18.539412521 +0000 UTC m=+2191.476973829" Feb 27 01:41:24 crc kubenswrapper[4771]: I0227 01:41:24.160439 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:24 crc kubenswrapper[4771]: I0227 01:41:24.160882 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:24 crc kubenswrapper[4771]: I0227 01:41:24.230602 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:24 crc kubenswrapper[4771]: I0227 01:41:24.621369 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:24 crc kubenswrapper[4771]: I0227 01:41:24.670109 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m5sqt"] Feb 27 01:41:26 crc kubenswrapper[4771]: I0227 01:41:26.595491 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m5sqt" podUID="98a4919b-d031-42ef-ae21-df6e724be151" containerName="registry-server" containerID="cri-o://9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238" gracePeriod=2 Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.078634 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.210146 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-utilities\") pod \"98a4919b-d031-42ef-ae21-df6e724be151\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.210408 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-catalog-content\") pod \"98a4919b-d031-42ef-ae21-df6e724be151\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.210455 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sxqm\" (UniqueName: \"kubernetes.io/projected/98a4919b-d031-42ef-ae21-df6e724be151-kube-api-access-7sxqm\") pod \"98a4919b-d031-42ef-ae21-df6e724be151\" (UID: \"98a4919b-d031-42ef-ae21-df6e724be151\") " Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.211158 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-utilities" (OuterVolumeSpecName: "utilities") pod "98a4919b-d031-42ef-ae21-df6e724be151" (UID: "98a4919b-d031-42ef-ae21-df6e724be151"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.219916 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98a4919b-d031-42ef-ae21-df6e724be151-kube-api-access-7sxqm" (OuterVolumeSpecName: "kube-api-access-7sxqm") pod "98a4919b-d031-42ef-ae21-df6e724be151" (UID: "98a4919b-d031-42ef-ae21-df6e724be151"). InnerVolumeSpecName "kube-api-access-7sxqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.312658 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sxqm\" (UniqueName: \"kubernetes.io/projected/98a4919b-d031-42ef-ae21-df6e724be151-kube-api-access-7sxqm\") on node \"crc\" DevicePath \"\"" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.312694 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.611244 4771 generic.go:334] "Generic (PLEG): container finished" podID="98a4919b-d031-42ef-ae21-df6e724be151" containerID="9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238" exitCode=0 Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.611302 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5sqt" event={"ID":"98a4919b-d031-42ef-ae21-df6e724be151","Type":"ContainerDied","Data":"9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238"} Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.611659 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5sqt" event={"ID":"98a4919b-d031-42ef-ae21-df6e724be151","Type":"ContainerDied","Data":"cb47b444188a9dc2d63c03d98640c50ae877cf56866eb3a88070a4d3f666df3f"} Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.611689 4771 scope.go:117] "RemoveContainer" containerID="9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.611338 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m5sqt" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.648838 4771 scope.go:117] "RemoveContainer" containerID="b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.679421 4771 scope.go:117] "RemoveContainer" containerID="90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.747548 4771 scope.go:117] "RemoveContainer" containerID="9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238" Feb 27 01:41:27 crc kubenswrapper[4771]: E0227 01:41:27.748200 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238\": container with ID starting with 9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238 not found: ID does not exist" containerID="9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.748237 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238"} err="failed to get container status \"9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238\": rpc error: code = NotFound desc = could not find container \"9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238\": container with ID starting with 9c56a9503fb1dddd9b4fa26292256bc128edd36193235e17ca55a98bbf4d4238 not found: ID does not exist" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.748265 4771 scope.go:117] "RemoveContainer" containerID="b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6" Feb 27 01:41:27 crc kubenswrapper[4771]: E0227 01:41:27.748859 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6\": container with ID starting with b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6 not found: ID does not exist" containerID="b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.748958 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6"} err="failed to get container status \"b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6\": rpc error: code = NotFound desc = could not find container \"b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6\": container with ID starting with b13c31ba8fb63f9042dd5ab3076ce4824ea101514258d5322f620f8c173226a6 not found: ID does not exist" Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.749633 4771 scope.go:117] "RemoveContainer" containerID="90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f" Feb 27 01:41:27 crc kubenswrapper[4771]: E0227 01:41:27.750598 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f\": container with ID starting with 90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f not found: ID does not exist" containerID="90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f" 
Feb 27 01:41:27 crc kubenswrapper[4771]: I0227 01:41:27.750642 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f"} err="failed to get container status \"90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f\": rpc error: code = NotFound desc = could not find container \"90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f\": container with ID starting with 90a9798bd0c18a3bb334c63b20ae644ade109fa93f81149b0bcb6648cc57354f not found: ID does not exist" Feb 27 01:41:28 crc kubenswrapper[4771]: I0227 01:41:28.079457 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98a4919b-d031-42ef-ae21-df6e724be151" (UID: "98a4919b-d031-42ef-ae21-df6e724be151"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:41:28 crc kubenswrapper[4771]: I0227 01:41:28.132329 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98a4919b-d031-42ef-ae21-df6e724be151-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:41:28 crc kubenswrapper[4771]: I0227 01:41:28.256025 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m5sqt"] Feb 27 01:41:28 crc kubenswrapper[4771]: I0227 01:41:28.268680 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m5sqt"] Feb 27 01:41:29 crc kubenswrapper[4771]: I0227 01:41:29.789155 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98a4919b-d031-42ef-ae21-df6e724be151" path="/var/lib/kubelet/pods/98a4919b-d031-42ef-ae21-df6e724be151/volumes" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.044890 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-746x6"] Feb 27 01:41:34 crc kubenswrapper[4771]: E0227 01:41:34.046002 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a4919b-d031-42ef-ae21-df6e724be151" containerName="extract-utilities" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.046022 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a4919b-d031-42ef-ae21-df6e724be151" containerName="extract-utilities" Feb 27 01:41:34 crc kubenswrapper[4771]: E0227 01:41:34.046042 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a4919b-d031-42ef-ae21-df6e724be151" containerName="extract-content" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.046050 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a4919b-d031-42ef-ae21-df6e724be151" containerName="extract-content" Feb 27 01:41:34 crc kubenswrapper[4771]: E0227 01:41:34.046095 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a4919b-d031-42ef-ae21-df6e724be151" containerName="registry-server" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.046113 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a4919b-d031-42ef-ae21-df6e724be151" containerName="registry-server" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.046384 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a4919b-d031-42ef-ae21-df6e724be151" containerName="registry-server" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.048450 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.055463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jknvw\" (UniqueName: \"kubernetes.io/projected/b415b584-3f32-4d98-aaed-2f622b93b43c-kube-api-access-jknvw\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.055860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-catalog-content\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.055988 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-utilities\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.071472 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-746x6"] Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.157660 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-utilities\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.157790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jknvw\" (UniqueName: \"kubernetes.io/projected/b415b584-3f32-4d98-aaed-2f622b93b43c-kube-api-access-jknvw\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.157829 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-catalog-content\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.158396 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-utilities\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.158418 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-catalog-content\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.181270 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jknvw\" (UniqueName: \"kubernetes.io/projected/b415b584-3f32-4d98-aaed-2f622b93b43c-kube-api-access-jknvw\") pod \"redhat-operators-746x6\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.390325 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:34 crc kubenswrapper[4771]: I0227 01:41:34.843523 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-746x6"] Feb 27 01:41:35 crc kubenswrapper[4771]: I0227 01:41:35.688664 4771 generic.go:334] "Generic (PLEG): container finished" podID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerID="a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b" exitCode=0 Feb 27 01:41:35 crc kubenswrapper[4771]: I0227 01:41:35.688854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746x6" event={"ID":"b415b584-3f32-4d98-aaed-2f622b93b43c","Type":"ContainerDied","Data":"a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b"} Feb 27 01:41:35 crc kubenswrapper[4771]: I0227 01:41:35.688962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746x6" event={"ID":"b415b584-3f32-4d98-aaed-2f622b93b43c","Type":"ContainerStarted","Data":"a91ff077675c488d49de159b123ddca281900f5b7121c370c7e6d08c87e7e4e3"} Feb 27 01:41:36 crc kubenswrapper[4771]: I0227 01:41:36.699860 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746x6" event={"ID":"b415b584-3f32-4d98-aaed-2f622b93b43c","Type":"ContainerStarted","Data":"c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d"} Feb 27 01:41:38 crc kubenswrapper[4771]: I0227 01:41:38.717516 4771 generic.go:334] "Generic (PLEG): container finished" podID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerID="c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d" exitCode=0 Feb 27 01:41:38 crc kubenswrapper[4771]: I0227 01:41:38.717620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746x6" event={"ID":"b415b584-3f32-4d98-aaed-2f622b93b43c","Type":"ContainerDied","Data":"c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d"} Feb 27 01:41:40 crc kubenswrapper[4771]: I0227 01:41:40.735909 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746x6" event={"ID":"b415b584-3f32-4d98-aaed-2f622b93b43c","Type":"ContainerStarted","Data":"72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9"} Feb 27 01:41:40 crc kubenswrapper[4771]: I0227 01:41:40.758242 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-746x6" podStartSLOduration=3.328523682 podStartE2EDuration="6.75822655s" podCreationTimestamp="2026-02-27 01:41:34 +0000 UTC" firstStartedPulling="2026-02-27 01:41:35.690872193 +0000 UTC m=+2208.628433471" lastFinishedPulling="2026-02-27 01:41:39.120575051 +0000 UTC m=+2212.058136339" observedRunningTime="2026-02-27 01:41:40.754341463 +0000 UTC m=+2213.691902781" watchObservedRunningTime="2026-02-27 01:41:40.75822655 +0000 UTC m=+2213.695787838" Feb 27 01:41:44 crc kubenswrapper[4771]: I0227 01:41:44.391269 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:44 crc kubenswrapper[4771]: I0227 01:41:44.391960 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:45 crc kubenswrapper[4771]: I0227 01:41:45.453058 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-746x6" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="registry-server" probeResult="failure" output=< Feb 27 01:41:45 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 27 01:41:45 crc kubenswrapper[4771]: > Feb 27 01:41:54 crc kubenswrapper[4771]: I0227 01:41:54.473342 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:54 crc kubenswrapper[4771]: I0227 01:41:54.568197 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:54 crc kubenswrapper[4771]: I0227 01:41:54.730970 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-746x6"] Feb 27 01:41:55 crc kubenswrapper[4771]: I0227 01:41:55.872519 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-746x6" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="registry-server" containerID="cri-o://72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9" gracePeriod=2 Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.390545 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.516005 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jknvw\" (UniqueName: \"kubernetes.io/projected/b415b584-3f32-4d98-aaed-2f622b93b43c-kube-api-access-jknvw\") pod \"b415b584-3f32-4d98-aaed-2f622b93b43c\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.516071 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-catalog-content\") pod \"b415b584-3f32-4d98-aaed-2f622b93b43c\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.516138 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-utilities\") pod \"b415b584-3f32-4d98-aaed-2f622b93b43c\" (UID: \"b415b584-3f32-4d98-aaed-2f622b93b43c\") " Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.516869 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-utilities" (OuterVolumeSpecName: "utilities") pod "b415b584-3f32-4d98-aaed-2f622b93b43c" (UID: "b415b584-3f32-4d98-aaed-2f622b93b43c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.517932 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.521894 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b415b584-3f32-4d98-aaed-2f622b93b43c-kube-api-access-jknvw" (OuterVolumeSpecName: "kube-api-access-jknvw") pod "b415b584-3f32-4d98-aaed-2f622b93b43c" (UID: "b415b584-3f32-4d98-aaed-2f622b93b43c"). InnerVolumeSpecName "kube-api-access-jknvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.619600 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jknvw\" (UniqueName: \"kubernetes.io/projected/b415b584-3f32-4d98-aaed-2f622b93b43c-kube-api-access-jknvw\") on node \"crc\" DevicePath \"\"" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.650988 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b415b584-3f32-4d98-aaed-2f622b93b43c" (UID: "b415b584-3f32-4d98-aaed-2f622b93b43c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.721268 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b415b584-3f32-4d98-aaed-2f622b93b43c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.885767 4771 generic.go:334] "Generic (PLEG): container finished" podID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerID="72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9" exitCode=0 Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.885814 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746x6" event={"ID":"b415b584-3f32-4d98-aaed-2f622b93b43c","Type":"ContainerDied","Data":"72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9"} Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.885847 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-746x6" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.885857 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-746x6" event={"ID":"b415b584-3f32-4d98-aaed-2f622b93b43c","Type":"ContainerDied","Data":"a91ff077675c488d49de159b123ddca281900f5b7121c370c7e6d08c87e7e4e3"} Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.885880 4771 scope.go:117] "RemoveContainer" containerID="72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.912305 4771 scope.go:117] "RemoveContainer" containerID="c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d" Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.927756 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-746x6"] Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.939364 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-746x6"] Feb 27 01:41:56 crc kubenswrapper[4771]: I0227 01:41:56.954459 4771 scope.go:117] "RemoveContainer" containerID="a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b" Feb 27 01:41:57 crc kubenswrapper[4771]: I0227 01:41:57.007715 4771 scope.go:117] "RemoveContainer" containerID="72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9" Feb 27 01:41:57 crc kubenswrapper[4771]: E0227 01:41:57.008219 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9\": container with ID starting with 72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9 not found: ID does not exist" containerID="72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9" Feb 27 01:41:57 crc kubenswrapper[4771]: I0227 01:41:57.008258 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9"} err="failed to get container status \"72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9\": rpc error: code = NotFound desc = could not find container \"72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9\": container with ID starting with 72cf51f127df9b1b07dde5f610628f8807ff1564ef0b3e501ba8fb5190d286d9 not found: ID does not exist" Feb 27 01:41:57 crc kubenswrapper[4771]: I0227 01:41:57.008284 4771 scope.go:117] "RemoveContainer" containerID="c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d" Feb 27 01:41:57 crc kubenswrapper[4771]: E0227 01:41:57.008714 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d\": container with ID starting with c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d not found: ID does not exist" containerID="c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d" Feb 27 01:41:57 crc kubenswrapper[4771]: I0227 01:41:57.008743 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d"} err="failed to get container status \"c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d\": rpc error: code = NotFound desc = could not find container 
\"c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d\": container with ID starting with c45ecd806dec72a117e33ee54e39236f47b4d7c826d4b2651eab4f8c0113931d not found: ID does not exist" Feb 27 01:41:57 crc kubenswrapper[4771]: I0227 01:41:57.008760 4771 scope.go:117] "RemoveContainer" containerID="a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b" Feb 27 01:41:57 crc kubenswrapper[4771]: E0227 01:41:57.009150 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b\": container with ID starting with a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b not found: ID does not exist" containerID="a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b" Feb 27 01:41:57 crc kubenswrapper[4771]: I0227 01:41:57.009203 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b"} err="failed to get container status \"a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b\": rpc error: code = NotFound desc = could not find container \"a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b\": container with ID starting with a7010ecb6650ab7714f312dc89f475766a8b23edf8a9f05a76985f425c072a0b not found: ID does not exist" Feb 27 01:41:57 crc kubenswrapper[4771]: I0227 01:41:57.791469 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" path="/var/lib/kubelet/pods/b415b584-3f32-4d98-aaed-2f622b93b43c/volumes" Feb 27 01:41:58 crc kubenswrapper[4771]: I0227 01:41:58.953721 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:41:58 crc kubenswrapper[4771]: I0227 01:41:58.954282 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.148882 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535942-hdqwj"] Feb 27 01:42:00 crc kubenswrapper[4771]: E0227 01:42:00.149353 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="extract-content" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.149368 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="extract-content" Feb 27 01:42:00 crc kubenswrapper[4771]: E0227 01:42:00.149385 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="extract-utilities" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.149394 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="extract-utilities" Feb 27 01:42:00 crc kubenswrapper[4771]: E0227 01:42:00.149432 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="registry-server" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.149439 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="registry-server" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.149691 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b415b584-3f32-4d98-aaed-2f622b93b43c" containerName="registry-server" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.150464 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535942-hdqwj" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.153233 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.153282 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.153775 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.173082 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535942-hdqwj"] Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.293837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drtwg\" (UniqueName: \"kubernetes.io/projected/04d72e49-e603-4837-88de-df823ea62b8e-kube-api-access-drtwg\") pod \"auto-csr-approver-29535942-hdqwj\" (UID: \"04d72e49-e603-4837-88de-df823ea62b8e\") " pod="openshift-infra/auto-csr-approver-29535942-hdqwj" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.396363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drtwg\" (UniqueName: \"kubernetes.io/projected/04d72e49-e603-4837-88de-df823ea62b8e-kube-api-access-drtwg\") pod \"auto-csr-approver-29535942-hdqwj\" (UID: \"04d72e49-e603-4837-88de-df823ea62b8e\") " pod="openshift-infra/auto-csr-approver-29535942-hdqwj" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.419361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drtwg\" (UniqueName: \"kubernetes.io/projected/04d72e49-e603-4837-88de-df823ea62b8e-kube-api-access-drtwg\") pod \"auto-csr-approver-29535942-hdqwj\" (UID: \"04d72e49-e603-4837-88de-df823ea62b8e\") " pod="openshift-infra/auto-csr-approver-29535942-hdqwj" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.478006 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535942-hdqwj" Feb 27 01:42:00 crc kubenswrapper[4771]: I0227 01:42:00.935224 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535942-hdqwj"] Feb 27 01:42:01 crc kubenswrapper[4771]: I0227 01:42:01.981151 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535942-hdqwj" event={"ID":"04d72e49-e603-4837-88de-df823ea62b8e","Type":"ContainerStarted","Data":"6dfb5242e0664d669ec383aa0e5c619a1e3416d845dc5406ad15252fe59042a6"} Feb 27 01:42:02 crc kubenswrapper[4771]: I0227 01:42:02.993374 4771 generic.go:334] "Generic (PLEG): container finished" podID="04d72e49-e603-4837-88de-df823ea62b8e" containerID="28af7a170fd2d45d61f4c3a7255ca8ead24b6ebecc75e0624862258f6030bcc4" exitCode=0 Feb 27 01:42:02 crc kubenswrapper[4771]: I0227 01:42:02.993417 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535942-hdqwj" event={"ID":"04d72e49-e603-4837-88de-df823ea62b8e","Type":"ContainerDied","Data":"28af7a170fd2d45d61f4c3a7255ca8ead24b6ebecc75e0624862258f6030bcc4"} Feb 27 01:42:04 crc kubenswrapper[4771]: I0227 01:42:04.347893 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535942-hdqwj" Feb 27 01:42:04 crc kubenswrapper[4771]: I0227 01:42:04.481187 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drtwg\" (UniqueName: \"kubernetes.io/projected/04d72e49-e603-4837-88de-df823ea62b8e-kube-api-access-drtwg\") pod \"04d72e49-e603-4837-88de-df823ea62b8e\" (UID: \"04d72e49-e603-4837-88de-df823ea62b8e\") " Feb 27 01:42:04 crc kubenswrapper[4771]: I0227 01:42:04.487102 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d72e49-e603-4837-88de-df823ea62b8e-kube-api-access-drtwg" (OuterVolumeSpecName: "kube-api-access-drtwg") pod "04d72e49-e603-4837-88de-df823ea62b8e" (UID: "04d72e49-e603-4837-88de-df823ea62b8e"). InnerVolumeSpecName "kube-api-access-drtwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:42:04 crc kubenswrapper[4771]: I0227 01:42:04.584325 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drtwg\" (UniqueName: \"kubernetes.io/projected/04d72e49-e603-4837-88de-df823ea62b8e-kube-api-access-drtwg\") on node \"crc\" DevicePath \"\"" Feb 27 01:42:05 crc kubenswrapper[4771]: I0227 01:42:05.014759 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535942-hdqwj" event={"ID":"04d72e49-e603-4837-88de-df823ea62b8e","Type":"ContainerDied","Data":"6dfb5242e0664d669ec383aa0e5c619a1e3416d845dc5406ad15252fe59042a6"} Feb 27 01:42:05 crc kubenswrapper[4771]: I0227 01:42:05.014821 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dfb5242e0664d669ec383aa0e5c619a1e3416d845dc5406ad15252fe59042a6" Feb 27 01:42:05 crc kubenswrapper[4771]: I0227 01:42:05.014909 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535942-hdqwj" Feb 27 01:42:05 crc kubenswrapper[4771]: I0227 01:42:05.433593 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535936-9v8rl"] Feb 27 01:42:05 crc kubenswrapper[4771]: I0227 01:42:05.447665 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535936-9v8rl"] Feb 27 01:42:05 crc kubenswrapper[4771]: I0227 01:42:05.783819 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ccf7d9-1c8a-4135-a561-4696af85f0d2" path="/var/lib/kubelet/pods/c4ccf7d9-1c8a-4135-a561-4696af85f0d2/volumes" Feb 27 01:42:28 crc kubenswrapper[4771]: I0227 01:42:28.953504 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:42:28 crc kubenswrapper[4771]: I0227 01:42:28.954666 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:42:29 crc kubenswrapper[4771]: I0227 01:42:29.341252 4771 scope.go:117] "RemoveContainer" containerID="8e9357fb966ff6815d3212cc080f320deadd0e408de9e6b6cc13ed3148b0b347" Feb 27 01:42:58 crc kubenswrapper[4771]: I0227 01:42:58.954047 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:42:58 crc kubenswrapper[4771]: I0227 01:42:58.954586 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:42:58 crc kubenswrapper[4771]: I0227 01:42:58.954645 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:42:58 crc kubenswrapper[4771]: I0227 01:42:58.955792 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fa58e5c69c6875961ade4a259d074511e93c8575b54e900bfb9f5dcc26be68a"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:42:58 crc kubenswrapper[4771]: I0227 01:42:58.955864 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://2fa58e5c69c6875961ade4a259d074511e93c8575b54e900bfb9f5dcc26be68a" gracePeriod=600 Feb 27 01:42:59 crc kubenswrapper[4771]: I0227 01:42:59.578777 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" 
containerID="2fa58e5c69c6875961ade4a259d074511e93c8575b54e900bfb9f5dcc26be68a" exitCode=0 Feb 27 01:42:59 crc kubenswrapper[4771]: I0227 01:42:59.578853 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"2fa58e5c69c6875961ade4a259d074511e93c8575b54e900bfb9f5dcc26be68a"} Feb 27 01:42:59 crc kubenswrapper[4771]: I0227 01:42:59.579511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"} Feb 27 01:42:59 crc kubenswrapper[4771]: I0227 01:42:59.579540 4771 scope.go:117] "RemoveContainer" containerID="cbde99cdc9aa4ee0931334df72d0d5990f6bd1badbf9dd9e11dc0c05675f514f" Feb 27 01:43:33 crc kubenswrapper[4771]: I0227 01:43:33.999137 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r4ztj"] Feb 27 01:43:34 crc kubenswrapper[4771]: E0227 01:43:34.000261 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d72e49-e603-4837-88de-df823ea62b8e" containerName="oc" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.000282 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d72e49-e603-4837-88de-df823ea62b8e" containerName="oc" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.000595 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d72e49-e603-4837-88de-df823ea62b8e" containerName="oc" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.002710 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.019385 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4ztj"] Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.049301 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-catalog-content\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.049360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-utilities\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.049457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqbx5\" (UniqueName: \"kubernetes.io/projected/303e4502-b658-46e9-93cd-d92c53e9c10a-kube-api-access-hqbx5\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.151219 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqbx5\" (UniqueName: \"kubernetes.io/projected/303e4502-b658-46e9-93cd-d92c53e9c10a-kube-api-access-hqbx5\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.151327 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-catalog-content\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.151360 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-utilities\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.151842 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-utilities\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.152077 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-catalog-content\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.175920 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hqbx5\" (UniqueName: \"kubernetes.io/projected/303e4502-b658-46e9-93cd-d92c53e9c10a-kube-api-access-hqbx5\") pod \"redhat-marketplace-r4ztj\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:34 crc kubenswrapper[4771]: I0227 01:43:34.363716 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:35 crc kubenswrapper[4771]: I0227 01:43:34.860409 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4ztj"] Feb 27 01:43:35 crc kubenswrapper[4771]: I0227 01:43:34.952203 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4ztj" event={"ID":"303e4502-b658-46e9-93cd-d92c53e9c10a","Type":"ContainerStarted","Data":"7ba540e0ed1bf0958cfb27f590646a954f41996d22a890e1f303ef0e34b9ff6f"} Feb 27 01:43:35 crc kubenswrapper[4771]: I0227 01:43:35.962711 4771 generic.go:334] "Generic (PLEG): container finished" podID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerID="c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb" exitCode=0 Feb 27 01:43:35 crc kubenswrapper[4771]: I0227 01:43:35.962766 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4ztj" event={"ID":"303e4502-b658-46e9-93cd-d92c53e9c10a","Type":"ContainerDied","Data":"c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb"} Feb 27 01:43:36 crc kubenswrapper[4771]: I0227 01:43:36.973043 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4ztj" event={"ID":"303e4502-b658-46e9-93cd-d92c53e9c10a","Type":"ContainerStarted","Data":"7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5"} Feb 27 01:43:37 crc kubenswrapper[4771]: I0227 01:43:37.985336 4771 generic.go:334] "Generic (PLEG): container finished" podID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerID="7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5" exitCode=0 Feb 27 01:43:37 crc kubenswrapper[4771]: I0227 01:43:37.985380 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4ztj" event={"ID":"303e4502-b658-46e9-93cd-d92c53e9c10a","Type":"ContainerDied","Data":"7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5"} Feb 27 01:43:38 crc kubenswrapper[4771]: I0227 01:43:38.996410 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4ztj" event={"ID":"303e4502-b658-46e9-93cd-d92c53e9c10a","Type":"ContainerStarted","Data":"0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52"} Feb 27 01:43:39 crc kubenswrapper[4771]: I0227 01:43:39.025567 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r4ztj" podStartSLOduration=3.599087661 podStartE2EDuration="6.025528514s" podCreationTimestamp="2026-02-27 01:43:33 +0000 UTC" firstStartedPulling="2026-02-27 01:43:35.965439747 +0000 UTC m=+2328.903001035" lastFinishedPulling="2026-02-27 01:43:38.39188056 +0000 UTC m=+2331.329441888" observedRunningTime="2026-02-27 01:43:39.01630471 +0000 UTC m=+2331.953866018" watchObservedRunningTime="2026-02-27 01:43:39.025528514 +0000 UTC m=+2331.963089812" Feb 27 01:43:44 crc kubenswrapper[4771]: I0227 01:43:44.364590 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:44 crc kubenswrapper[4771]: I0227 01:43:44.365149 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:44 crc kubenswrapper[4771]: I0227 01:43:44.414240 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:45 crc kubenswrapper[4771]: I0227 01:43:45.093886 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:45 crc kubenswrapper[4771]: I0227 01:43:45.158063 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4ztj"] Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.064254 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r4ztj" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerName="registry-server" containerID="cri-o://0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52" gracePeriod=2 Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.528507 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.631161 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqbx5\" (UniqueName: \"kubernetes.io/projected/303e4502-b658-46e9-93cd-d92c53e9c10a-kube-api-access-hqbx5\") pod \"303e4502-b658-46e9-93cd-d92c53e9c10a\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.631262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-catalog-content\") pod \"303e4502-b658-46e9-93cd-d92c53e9c10a\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.631487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-utilities\") pod \"303e4502-b658-46e9-93cd-d92c53e9c10a\" (UID: \"303e4502-b658-46e9-93cd-d92c53e9c10a\") " Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.632390 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-utilities" (OuterVolumeSpecName: "utilities") pod "303e4502-b658-46e9-93cd-d92c53e9c10a" (UID: "303e4502-b658-46e9-93cd-d92c53e9c10a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.643846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303e4502-b658-46e9-93cd-d92c53e9c10a-kube-api-access-hqbx5" (OuterVolumeSpecName: "kube-api-access-hqbx5") pod "303e4502-b658-46e9-93cd-d92c53e9c10a" (UID: "303e4502-b658-46e9-93cd-d92c53e9c10a"). InnerVolumeSpecName "kube-api-access-hqbx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.668481 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "303e4502-b658-46e9-93cd-d92c53e9c10a" (UID: "303e4502-b658-46e9-93cd-d92c53e9c10a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.733162 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.733525 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqbx5\" (UniqueName: \"kubernetes.io/projected/303e4502-b658-46e9-93cd-d92c53e9c10a-kube-api-access-hqbx5\") on node \"crc\" DevicePath \"\"" Feb 27 01:43:47 crc kubenswrapper[4771]: I0227 01:43:47.733540 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/303e4502-b658-46e9-93cd-d92c53e9c10a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.080086 4771 generic.go:334] "Generic (PLEG): container finished" podID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerID="0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52" exitCode=0 Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.080139 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4ztj" event={"ID":"303e4502-b658-46e9-93cd-d92c53e9c10a","Type":"ContainerDied","Data":"0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52"} Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.080173 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4ztj" event={"ID":"303e4502-b658-46e9-93cd-d92c53e9c10a","Type":"ContainerDied","Data":"7ba540e0ed1bf0958cfb27f590646a954f41996d22a890e1f303ef0e34b9ff6f"} Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.080196 4771 scope.go:117] "RemoveContainer" containerID="0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.080334 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4ztj" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.123963 4771 scope.go:117] "RemoveContainer" containerID="7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.128111 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4ztj"] Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.141296 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4ztj"] Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.157645 4771 scope.go:117] "RemoveContainer" containerID="c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.198081 4771 scope.go:117] "RemoveContainer" containerID="0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52" Feb 27 01:43:48 crc kubenswrapper[4771]: E0227 01:43:48.198870 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52\": container with ID starting with 0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52 not found: ID does not exist" containerID="0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.198929 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52"} err="failed to get container status \"0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52\": rpc error: code = NotFound desc = could not find container \"0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52\": container with ID starting with 0223b399a9cac8784bb3fc4fa0f6c2c7dc5888b9110fe44396b05c6fc6edcd52 not found: ID does not exist" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.198963 4771 scope.go:117] "RemoveContainer" containerID="7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5" Feb 27 01:43:48 crc kubenswrapper[4771]: E0227 01:43:48.199383 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5\": container with ID starting with 7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5 not found: ID does not exist" containerID="7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.199443 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5"} err="failed to get container status \"7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5\": rpc error: code = NotFound desc = could not find container \"7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5\": container with ID starting with 7b74b455ca879d1954ad17094e8a6adc4234d6b4973a319541c1a80b0749aab5 not found: ID does not exist" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.199485 4771 scope.go:117] "RemoveContainer" containerID="c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb" Feb 27 01:43:48 crc kubenswrapper[4771]: E0227 01:43:48.199798 4771 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb\": container with ID starting with c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb not found: ID does not exist" containerID="c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb" Feb 27 01:43:48 crc kubenswrapper[4771]: I0227 01:43:48.199835 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb"} err="failed to get container status \"c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb\": rpc error: code = NotFound desc = could not find container \"c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb\": container with ID starting with c13412e8dbc39bdc5aaf8ddc6ebb85e63c85f42293e3be37f11db65142cb31cb not found: ID does not exist" Feb 27 01:43:49 crc kubenswrapper[4771]: I0227 01:43:49.789248 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" path="/var/lib/kubelet/pods/303e4502-b658-46e9-93cd-d92c53e9c10a/volumes" Feb 27 01:43:50 crc kubenswrapper[4771]: I0227 01:43:50.110167 4771 generic.go:334] "Generic (PLEG): container finished" podID="40c7ae0e-123b-42cf-99cf-57309d7c22b0" containerID="fa68ff7b11ac3d5081d22fc1a7e2eee79bb32e8839e91c6eb57c62684a2e362d" exitCode=0 Feb 27 01:43:50 crc kubenswrapper[4771]: I0227 01:43:50.110210 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" event={"ID":"40c7ae0e-123b-42cf-99cf-57309d7c22b0","Type":"ContainerDied","Data":"fa68ff7b11ac3d5081d22fc1a7e2eee79bb32e8839e91c6eb57c62684a2e362d"} Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.511398 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.609810 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-combined-ca-bundle\") pod \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.609891 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-ssh-key-openstack-edpm-ipam\") pod \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.609996 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-secret-0\") pod \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.610071 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-inventory\") pod \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.610129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29sfw\" (UniqueName: \"kubernetes.io/projected/40c7ae0e-123b-42cf-99cf-57309d7c22b0-kube-api-access-29sfw\") pod \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\" (UID: \"40c7ae0e-123b-42cf-99cf-57309d7c22b0\") " Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.615429 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "40c7ae0e-123b-42cf-99cf-57309d7c22b0" (UID: "40c7ae0e-123b-42cf-99cf-57309d7c22b0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.616346 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c7ae0e-123b-42cf-99cf-57309d7c22b0-kube-api-access-29sfw" (OuterVolumeSpecName: "kube-api-access-29sfw") pod "40c7ae0e-123b-42cf-99cf-57309d7c22b0" (UID: "40c7ae0e-123b-42cf-99cf-57309d7c22b0"). InnerVolumeSpecName "kube-api-access-29sfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.653925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "40c7ae0e-123b-42cf-99cf-57309d7c22b0" (UID: "40c7ae0e-123b-42cf-99cf-57309d7c22b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.656013 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "40c7ae0e-123b-42cf-99cf-57309d7c22b0" (UID: "40c7ae0e-123b-42cf-99cf-57309d7c22b0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.658122 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-inventory" (OuterVolumeSpecName: "inventory") pod "40c7ae0e-123b-42cf-99cf-57309d7c22b0" (UID: "40c7ae0e-123b-42cf-99cf-57309d7c22b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.712845 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-inventory\") on node \"crc\" DevicePath \"\"" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.712920 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29sfw\" (UniqueName: \"kubernetes.io/projected/40c7ae0e-123b-42cf-99cf-57309d7c22b0-kube-api-access-29sfw\") on node \"crc\" DevicePath \"\"" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.712941 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.712959 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 27 01:43:51 crc kubenswrapper[4771]: I0227 01:43:51.712973 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/40c7ae0e-123b-42cf-99cf-57309d7c22b0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.132429 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" event={"ID":"40c7ae0e-123b-42cf-99cf-57309d7c22b0","Type":"ContainerDied","Data":"c71b92ac50bc7ab5890b45a77763c22a8eb33d27d98d1283dfc0393a381cb846"} Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.132478 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c71b92ac50bc7ab5890b45a77763c22a8eb33d27d98d1283dfc0393a381cb846" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.132528 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.246474 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2"] Feb 27 01:43:52 crc kubenswrapper[4771]: E0227 01:43:52.247076 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerName="extract-content" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.247097 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerName="extract-content" Feb 27 01:43:52 crc kubenswrapper[4771]: E0227 01:43:52.247109 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerName="registry-server" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.247116 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerName="registry-server" Feb 27 01:43:52 crc kubenswrapper[4771]: E0227 01:43:52.247159 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c7ae0e-123b-42cf-99cf-57309d7c22b0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.247169 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c7ae0e-123b-42cf-99cf-57309d7c22b0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 01:43:52 crc kubenswrapper[4771]: E0227 01:43:52.247209 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerName="extract-utilities" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.247218 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerName="extract-utilities" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.247410 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="303e4502-b658-46e9-93cd-d92c53e9c10a" containerName="registry-server" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.247438 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c7ae0e-123b-42cf-99cf-57309d7c22b0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.248193 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.251812 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.252056 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.252068 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.251644 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.251948 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.251843 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.252727 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.259509 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2"] Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.322747 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.322801 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.322824 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.322870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.322912 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp6x8\" (UniqueName: 
\"kubernetes.io/projected/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-kube-api-access-qp6x8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.322938 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.322967 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.323047 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.323111 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.323143 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.323193 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.424978 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp6x8\" (UniqueName: \"kubernetes.io/projected/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-kube-api-access-qp6x8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425031 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425133 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425243 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425293 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425346 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425397 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.425443 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.426322 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.431297 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.431829 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.431886 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.431955 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.432088 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.432292 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.432781 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.433254 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.433265 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.445603 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp6x8\" (UniqueName: \"kubernetes.io/projected/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-kube-api-access-qp6x8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-6lnt2\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:52 crc kubenswrapper[4771]: I0227 01:43:52.576891 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" Feb 27 01:43:53 crc kubenswrapper[4771]: I0227 01:43:53.108043 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2"] Feb 27 01:43:53 crc kubenswrapper[4771]: I0227 01:43:53.145009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" event={"ID":"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99","Type":"ContainerStarted","Data":"620656884f7a96380e02f683d8e29832458353ef22d3af316fceef3ac515ebe3"} Feb 27 01:43:54 crc kubenswrapper[4771]: I0227 01:43:54.154743 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" event={"ID":"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99","Type":"ContainerStarted","Data":"e29e0aa1f6d3b7aceb072df6517545167f21c0e3aa4a377f4ba845f3ff7cf533"} Feb 27 01:43:54 crc kubenswrapper[4771]: I0227 01:43:54.180212 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" podStartSLOduration=1.7366302 podStartE2EDuration="2.180196621s" podCreationTimestamp="2026-02-27 01:43:52 +0000 UTC" firstStartedPulling="2026-02-27 01:43:53.112872039 +0000 UTC m=+2346.050433337" lastFinishedPulling="2026-02-27 01:43:53.55643847 +0000 UTC m=+2346.493999758" observedRunningTime="2026-02-27 01:43:54.172677015 +0000 UTC m=+2347.110238303" watchObservedRunningTime="2026-02-27 01:43:54.180196621 +0000 UTC m=+2347.117757909" Feb 27 01:43:54 crc kubenswrapper[4771]: E0227 01:43:54.335561 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice/crio-7ba540e0ed1bf0958cfb27f590646a954f41996d22a890e1f303ef0e34b9ff6f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice\": RecentStats: unable to find data in memory cache]" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.145835 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535944-dvp6w"] Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.147667 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.150704 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.150942 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.151455 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.158093 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535944-dvp6w"] Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.182578 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ttzj\" (UniqueName: \"kubernetes.io/projected/8ef05ae1-b240-4abf-a4d4-0605ec393956-kube-api-access-8ttzj\") pod \"auto-csr-approver-29535944-dvp6w\" (UID: \"8ef05ae1-b240-4abf-a4d4-0605ec393956\") " pod="openshift-infra/auto-csr-approver-29535944-dvp6w" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.284856 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ttzj\" (UniqueName: \"kubernetes.io/projected/8ef05ae1-b240-4abf-a4d4-0605ec393956-kube-api-access-8ttzj\") pod \"auto-csr-approver-29535944-dvp6w\" (UID: \"8ef05ae1-b240-4abf-a4d4-0605ec393956\") " pod="openshift-infra/auto-csr-approver-29535944-dvp6w" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.312447 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ttzj\" (UniqueName: \"kubernetes.io/projected/8ef05ae1-b240-4abf-a4d4-0605ec393956-kube-api-access-8ttzj\") pod \"auto-csr-approver-29535944-dvp6w\" (UID: \"8ef05ae1-b240-4abf-a4d4-0605ec393956\") " pod="openshift-infra/auto-csr-approver-29535944-dvp6w" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.482243 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" Feb 27 01:44:00 crc kubenswrapper[4771]: I0227 01:44:00.924648 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535944-dvp6w"] Feb 27 01:44:01 crc kubenswrapper[4771]: I0227 01:44:01.226462 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" event={"ID":"8ef05ae1-b240-4abf-a4d4-0605ec393956","Type":"ContainerStarted","Data":"1441198e1d65a5471679c4506e720c4d119c435e4a6d0cfcbdccee7622d00579"} Feb 27 01:44:02 crc kubenswrapper[4771]: I0227 01:44:02.235093 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" event={"ID":"8ef05ae1-b240-4abf-a4d4-0605ec393956","Type":"ContainerStarted","Data":"ed4cfb69ce9a6c93c545ae6a84ee2299a88174de2d369eb8caa33840cbc1047f"} Feb 27 01:44:02 crc kubenswrapper[4771]: I0227 01:44:02.271027 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" podStartSLOduration=1.249576282 podStartE2EDuration="2.270999182s" podCreationTimestamp="2026-02-27 01:44:00 +0000 UTC" firstStartedPulling="2026-02-27 01:44:00.941928417 +0000 UTC m=+2353.879489705" lastFinishedPulling="2026-02-27 01:44:01.963351317 +0000 UTC m=+2354.900912605" observedRunningTime="2026-02-27 01:44:02.248579856 +0000 UTC m=+2355.186141154" watchObservedRunningTime="2026-02-27 01:44:02.270999182 +0000 UTC m=+2355.208560500" Feb 27 01:44:03 crc kubenswrapper[4771]: I0227 01:44:03.249805 4771 generic.go:334] "Generic (PLEG): container finished" podID="8ef05ae1-b240-4abf-a4d4-0605ec393956" containerID="ed4cfb69ce9a6c93c545ae6a84ee2299a88174de2d369eb8caa33840cbc1047f" exitCode=0 Feb 27 01:44:03 crc kubenswrapper[4771]: I0227 01:44:03.249851 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" event={"ID":"8ef05ae1-b240-4abf-a4d4-0605ec393956","Type":"ContainerDied","Data":"ed4cfb69ce9a6c93c545ae6a84ee2299a88174de2d369eb8caa33840cbc1047f"} Feb 27 01:44:04 crc kubenswrapper[4771]: E0227 01:44:04.591249 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice/crio-7ba540e0ed1bf0958cfb27f590646a954f41996d22a890e1f303ef0e34b9ff6f\": RecentStats: unable to find data in memory cache]" Feb 27 01:44:04 crc kubenswrapper[4771]: I0227 01:44:04.597235 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" Feb 27 01:44:04 crc kubenswrapper[4771]: I0227 01:44:04.766536 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ttzj\" (UniqueName: \"kubernetes.io/projected/8ef05ae1-b240-4abf-a4d4-0605ec393956-kube-api-access-8ttzj\") pod \"8ef05ae1-b240-4abf-a4d4-0605ec393956\" (UID: \"8ef05ae1-b240-4abf-a4d4-0605ec393956\") " Feb 27 01:44:04 crc kubenswrapper[4771]: I0227 01:44:04.772342 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef05ae1-b240-4abf-a4d4-0605ec393956-kube-api-access-8ttzj" (OuterVolumeSpecName: "kube-api-access-8ttzj") pod "8ef05ae1-b240-4abf-a4d4-0605ec393956" (UID: "8ef05ae1-b240-4abf-a4d4-0605ec393956"). InnerVolumeSpecName "kube-api-access-8ttzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:44:04 crc kubenswrapper[4771]: I0227 01:44:04.869292 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ttzj\" (UniqueName: \"kubernetes.io/projected/8ef05ae1-b240-4abf-a4d4-0605ec393956-kube-api-access-8ttzj\") on node \"crc\" DevicePath \"\"" Feb 27 01:44:05 crc kubenswrapper[4771]: I0227 01:44:05.271483 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" event={"ID":"8ef05ae1-b240-4abf-a4d4-0605ec393956","Type":"ContainerDied","Data":"1441198e1d65a5471679c4506e720c4d119c435e4a6d0cfcbdccee7622d00579"} Feb 27 01:44:05 crc kubenswrapper[4771]: I0227 01:44:05.273597 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1441198e1d65a5471679c4506e720c4d119c435e4a6d0cfcbdccee7622d00579" Feb 27 01:44:05 crc kubenswrapper[4771]: I0227 01:44:05.271612 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535944-dvp6w" Feb 27 01:44:05 crc kubenswrapper[4771]: I0227 01:44:05.332676 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535938-jkl79"] Feb 27 01:44:05 crc kubenswrapper[4771]: I0227 01:44:05.342386 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535938-jkl79"] Feb 27 01:44:05 crc kubenswrapper[4771]: I0227 01:44:05.784386 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8e0385d-ba62-4fcc-a059-5114bb130263" path="/var/lib/kubelet/pods/c8e0385d-ba62-4fcc-a059-5114bb130263/volumes" Feb 27 01:44:14 crc kubenswrapper[4771]: E0227 01:44:14.819970 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice/crio-7ba540e0ed1bf0958cfb27f590646a954f41996d22a890e1f303ef0e34b9ff6f\": RecentStats: unable to find data in memory cache]" Feb 27 01:44:25 crc kubenswrapper[4771]: E0227 01:44:25.048096 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice/crio-7ba540e0ed1bf0958cfb27f590646a954f41996d22a890e1f303ef0e34b9ff6f\": RecentStats: unable to find data in memory cache]" Feb 27 01:44:29 crc kubenswrapper[4771]: I0227 01:44:29.458437 4771 scope.go:117] "RemoveContainer" containerID="636e297e3d86ea96342139341e1b675169250e885d08f049b5c0e5a8a772580d" Feb 27 01:44:35 crc kubenswrapper[4771]: E0227 01:44:35.278966 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice/crio-7ba540e0ed1bf0958cfb27f590646a954f41996d22a890e1f303ef0e34b9ff6f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice\": RecentStats: unable to find data in memory cache]" Feb 27 01:44:45 crc kubenswrapper[4771]: E0227 01:44:45.492443 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303e4502_b658_46e9_93cd_d92c53e9c10a.slice/crio-7ba540e0ed1bf0958cfb27f590646a954f41996d22a890e1f303ef0e34b9ff6f\": RecentStats: unable to find data in memory cache]" Feb 27 01:44:47 crc kubenswrapper[4771]: E0227 01:44:47.812425 4771 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/03f69983b078b27075629bff2a5eadc694e6269eb99dacf15c41d77bb951ca5a/diff" to get inode usage: stat /var/lib/containers/storage/overlay/03f69983b078b27075629bff2a5eadc694e6269eb99dacf15c41d77bb951ca5a/diff: no such file or directory, 
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.164334 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q"]
Feb 27 01:45:00 crc kubenswrapper[4771]: E0227 01:45:00.166741 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef05ae1-b240-4abf-a4d4-0605ec393956" containerName="oc"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.166837 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef05ae1-b240-4abf-a4d4-0605ec393956" containerName="oc"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.167126 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef05ae1-b240-4abf-a4d4-0605ec393956" containerName="oc"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.167972 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.170365 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.170671 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.189475 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q"]
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.333290 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ldqn\" (UniqueName: \"kubernetes.io/projected/4e858307-f3ef-4a04-8bf3-dcf5081e599a-kube-api-access-5ldqn\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.333407 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e858307-f3ef-4a04-8bf3-dcf5081e599a-secret-volume\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.333714 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e858307-f3ef-4a04-8bf3-dcf5081e599a-config-volume\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.435477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e858307-f3ef-4a04-8bf3-dcf5081e599a-config-volume\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q"
Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.435698 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ldqn\" (UniqueName:
\"kubernetes.io/projected/4e858307-f3ef-4a04-8bf3-dcf5081e599a-kube-api-access-5ldqn\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.435726 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e858307-f3ef-4a04-8bf3-dcf5081e599a-secret-volume\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.436604 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e858307-f3ef-4a04-8bf3-dcf5081e599a-config-volume\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.444329 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e858307-f3ef-4a04-8bf3-dcf5081e599a-secret-volume\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.452766 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ldqn\" (UniqueName: \"kubernetes.io/projected/4e858307-f3ef-4a04-8bf3-dcf5081e599a-kube-api-access-5ldqn\") pod \"collect-profiles-29535945-kfw8q\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.525092 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" Feb 27 01:45:00 crc kubenswrapper[4771]: I0227 01:45:00.952687 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q"] Feb 27 01:45:01 crc kubenswrapper[4771]: I0227 01:45:01.889398 4771 generic.go:334] "Generic (PLEG): container finished" podID="4e858307-f3ef-4a04-8bf3-dcf5081e599a" containerID="d47bcd0c75d1dcc8e425b98e37d716747976d1dab14c4ef4ab65ddca2f2c847a" exitCode=0 Feb 27 01:45:01 crc kubenswrapper[4771]: I0227 01:45:01.889445 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" event={"ID":"4e858307-f3ef-4a04-8bf3-dcf5081e599a","Type":"ContainerDied","Data":"d47bcd0c75d1dcc8e425b98e37d716747976d1dab14c4ef4ab65ddca2f2c847a"} Feb 27 01:45:01 crc kubenswrapper[4771]: I0227 01:45:01.889473 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" event={"ID":"4e858307-f3ef-4a04-8bf3-dcf5081e599a","Type":"ContainerStarted","Data":"f1406ac707715ced594203b6229e7d6543a328bcd960496be27b092154921403"} Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.293346 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.394594 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ldqn\" (UniqueName: \"kubernetes.io/projected/4e858307-f3ef-4a04-8bf3-dcf5081e599a-kube-api-access-5ldqn\") pod \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.394749 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e858307-f3ef-4a04-8bf3-dcf5081e599a-secret-volume\") pod \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.394858 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e858307-f3ef-4a04-8bf3-dcf5081e599a-config-volume\") pod \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\" (UID: \"4e858307-f3ef-4a04-8bf3-dcf5081e599a\") " Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.396024 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e858307-f3ef-4a04-8bf3-dcf5081e599a-config-volume" (OuterVolumeSpecName: "config-volume") pod "4e858307-f3ef-4a04-8bf3-dcf5081e599a" (UID: "4e858307-f3ef-4a04-8bf3-dcf5081e599a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.403038 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e858307-f3ef-4a04-8bf3-dcf5081e599a-kube-api-access-5ldqn" (OuterVolumeSpecName: "kube-api-access-5ldqn") pod "4e858307-f3ef-4a04-8bf3-dcf5081e599a" (UID: "4e858307-f3ef-4a04-8bf3-dcf5081e599a"). InnerVolumeSpecName "kube-api-access-5ldqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.417161 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e858307-f3ef-4a04-8bf3-dcf5081e599a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4e858307-f3ef-4a04-8bf3-dcf5081e599a" (UID: "4e858307-f3ef-4a04-8bf3-dcf5081e599a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.497710 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e858307-f3ef-4a04-8bf3-dcf5081e599a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.497769 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e858307-f3ef-4a04-8bf3-dcf5081e599a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.497794 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ldqn\" (UniqueName: \"kubernetes.io/projected/4e858307-f3ef-4a04-8bf3-dcf5081e599a-kube-api-access-5ldqn\") on node \"crc\" DevicePath \"\"" Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.912768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" event={"ID":"4e858307-f3ef-4a04-8bf3-dcf5081e599a","Type":"ContainerDied","Data":"f1406ac707715ced594203b6229e7d6543a328bcd960496be27b092154921403"} Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.912827 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1406ac707715ced594203b6229e7d6543a328bcd960496be27b092154921403" Feb 27 01:45:03 crc kubenswrapper[4771]: I0227 01:45:03.912843 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535945-kfw8q" Feb 27 01:45:04 crc kubenswrapper[4771]: I0227 01:45:04.381476 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc"] Feb 27 01:45:04 crc kubenswrapper[4771]: I0227 01:45:04.395160 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535900-m4svc"] Feb 27 01:45:05 crc kubenswrapper[4771]: I0227 01:45:05.792455 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6de81df-af0d-4ebe-b254-7a45c4eb5312" path="/var/lib/kubelet/pods/b6de81df-af0d-4ebe-b254-7a45c4eb5312/volumes" Feb 27 01:45:28 crc kubenswrapper[4771]: I0227 01:45:28.953004 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:45:28 crc kubenswrapper[4771]: I0227 01:45:28.953699 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:45:29 crc kubenswrapper[4771]: I0227 01:45:29.553385 4771 scope.go:117] "RemoveContainer" containerID="e0e334c29109c38d41bb92b4427f3ba3625e86f2c9191671d44c6e07d0b9487f" Feb 27 01:45:58 crc kubenswrapper[4771]: I0227 01:45:58.953159 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Feb 27 01:45:58 crc kubenswrapper[4771]: I0227 01:45:58.953666 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.160791 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535946-9w266"]
Feb 27 01:46:00 crc kubenswrapper[4771]: E0227 01:46:00.161836 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e858307-f3ef-4a04-8bf3-dcf5081e599a" containerName="collect-profiles"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.161853 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e858307-f3ef-4a04-8bf3-dcf5081e599a" containerName="collect-profiles"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.162140 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e858307-f3ef-4a04-8bf3-dcf5081e599a" containerName="collect-profiles"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.163050 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535946-9w266"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.166982 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.167185 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.167339 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.170317 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535946-9w266"]
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.194229 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clkbk\" (UniqueName: \"kubernetes.io/projected/8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a-kube-api-access-clkbk\") pod \"auto-csr-approver-29535946-9w266\" (UID: \"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a\") " pod="openshift-infra/auto-csr-approver-29535946-9w266"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.296159 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clkbk\" (UniqueName: \"kubernetes.io/projected/8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a-kube-api-access-clkbk\") pod \"auto-csr-approver-29535946-9w266\" (UID: \"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a\") " pod="openshift-infra/auto-csr-approver-29535946-9w266"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.317765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clkbk\" (UniqueName: \"kubernetes.io/projected/8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a-kube-api-access-clkbk\") pod \"auto-csr-approver-29535946-9w266\" (UID: \"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a\") " pod="openshift-infra/auto-csr-approver-29535946-9w266"
Feb 27 01:46:00 crc kubenswrapper[4771]: I0227 01:46:00.508052 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535946-9w266"
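The numeric suffix in auto-csr-approver-29535946-9w266 (and in 29535944 and 29535945 earlier) is the CronJob controller's scheduled-time encoding: each Job is named after its schedule slot in minutes since the Unix epoch, which is why this approver, apparently scheduled every two minutes, gains 2 per run while collect-profiles-29535945 decodes to 01:45:00. A quick check of the decoding:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const suffix = 29535946 // from auto-csr-approver-29535946-9w266 above
	t := time.Unix(suffix*60, 0).UTC()
	fmt.Println(t) // 2026-02-27 01:46:00 +0000 UTC — the SyncLoop ADD time above
}
```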
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535946-9w266" Feb 27 01:46:01 crc kubenswrapper[4771]: I0227 01:46:01.060725 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535946-9w266"] Feb 27 01:46:01 crc kubenswrapper[4771]: W0227 01:46:01.066766 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c2d3b83_c0f9_48e9_9e64_1ad5d78fb32a.slice/crio-7ce07bbd0fd7ca2ee2ae90204b881038448576947162ed2a5e4eb6f68b97fe1a WatchSource:0}: Error finding container 7ce07bbd0fd7ca2ee2ae90204b881038448576947162ed2a5e4eb6f68b97fe1a: Status 404 returned error can't find the container with id 7ce07bbd0fd7ca2ee2ae90204b881038448576947162ed2a5e4eb6f68b97fe1a Feb 27 01:46:01 crc kubenswrapper[4771]: I0227 01:46:01.070511 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:46:01 crc kubenswrapper[4771]: I0227 01:46:01.475681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535946-9w266" event={"ID":"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a","Type":"ContainerStarted","Data":"7ce07bbd0fd7ca2ee2ae90204b881038448576947162ed2a5e4eb6f68b97fe1a"} Feb 27 01:46:02 crc kubenswrapper[4771]: I0227 01:46:02.489859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535946-9w266" event={"ID":"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a","Type":"ContainerStarted","Data":"78c36c48b13c35705a9cfdf04dbf6a52be5a934d1eaedd73687393b5792908b4"} Feb 27 01:46:02 crc kubenswrapper[4771]: I0227 01:46:02.511082 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535946-9w266" podStartSLOduration=1.575002233 podStartE2EDuration="2.511063478s" podCreationTimestamp="2026-02-27 01:46:00 +0000 UTC" firstStartedPulling="2026-02-27 01:46:01.070242441 +0000 UTC m=+2474.007803739" lastFinishedPulling="2026-02-27 01:46:02.006303666 +0000 UTC m=+2474.943864984" observedRunningTime="2026-02-27 01:46:02.501819884 +0000 UTC m=+2475.439381172" watchObservedRunningTime="2026-02-27 01:46:02.511063478 +0000 UTC m=+2475.448624766" Feb 27 01:46:03 crc kubenswrapper[4771]: I0227 01:46:03.503058 4771 generic.go:334] "Generic (PLEG): container finished" podID="8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a" containerID="78c36c48b13c35705a9cfdf04dbf6a52be5a934d1eaedd73687393b5792908b4" exitCode=0 Feb 27 01:46:03 crc kubenswrapper[4771]: I0227 01:46:03.503110 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535946-9w266" event={"ID":"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a","Type":"ContainerDied","Data":"78c36c48b13c35705a9cfdf04dbf6a52be5a934d1eaedd73687393b5792908b4"} Feb 27 01:46:04 crc kubenswrapper[4771]: I0227 01:46:04.998896 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535946-9w266" Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.196882 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clkbk\" (UniqueName: \"kubernetes.io/projected/8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a-kube-api-access-clkbk\") pod \"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a\" (UID: \"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a\") " Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.203932 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a-kube-api-access-clkbk" (OuterVolumeSpecName: "kube-api-access-clkbk") pod "8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a" (UID: "8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a"). InnerVolumeSpecName "kube-api-access-clkbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.298809 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clkbk\" (UniqueName: \"kubernetes.io/projected/8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a-kube-api-access-clkbk\") on node \"crc\" DevicePath \"\"" Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.527636 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535946-9w266" event={"ID":"8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a","Type":"ContainerDied","Data":"7ce07bbd0fd7ca2ee2ae90204b881038448576947162ed2a5e4eb6f68b97fe1a"} Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.527695 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ce07bbd0fd7ca2ee2ae90204b881038448576947162ed2a5e4eb6f68b97fe1a" Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.527725 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535946-9w266" Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.597622 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535940-t6x6t"] Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.605410 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535940-t6x6t"] Feb 27 01:46:05 crc kubenswrapper[4771]: I0227 01:46:05.785867 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb33d2c-2a90-4f39-b5a4-1f18141ad41d" path="/var/lib/kubelet/pods/2bb33d2c-2a90-4f39-b5a4-1f18141ad41d/volumes" Feb 27 01:46:23 crc kubenswrapper[4771]: I0227 01:46:23.727532 4771 generic.go:334] "Generic (PLEG): container finished" podID="d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" containerID="e29e0aa1f6d3b7aceb072df6517545167f21c0e3aa4a377f4ba845f3ff7cf533" exitCode=0 Feb 27 01:46:23 crc kubenswrapper[4771]: I0227 01:46:23.727634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" event={"ID":"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99","Type":"ContainerDied","Data":"e29e0aa1f6d3b7aceb072df6517545167f21c0e3aa4a377f4ba845f3ff7cf533"} Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.194010 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.237341 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-combined-ca-bundle\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.237671 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-inventory\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.237898 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp6x8\" (UniqueName: \"kubernetes.io/projected/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-kube-api-access-qp6x8\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.238097 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-3\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.238346 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-2\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.238521 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-1\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.238693 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-extra-config-0\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.238862 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-0\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.239006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-ssh-key-openstack-edpm-ipam\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.239169 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-1\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.239328 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-0\") pod \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\" (UID: \"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99\") "
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.244792 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.260659 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-kube-api-access-qp6x8" (OuterVolumeSpecName: "kube-api-access-qp6x8") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "kube-api-access-qp6x8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.269123 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.283050 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-inventory" (OuterVolumeSpecName: "inventory") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.286341 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.296844 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.297964 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.299495 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.305290 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.318591 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.319858 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" (UID: "d2a7b19f-a0a4-4aa8-80c5-f05300c19d99"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344025 4771 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344063 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344075 4771 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344088 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344100 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp6x8\" (UniqueName: \"kubernetes.io/projected/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-kube-api-access-qp6x8\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344112 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344122 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344134 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344144 4771 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344154 4771 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.344164 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2a7b19f-a0a4-4aa8-80c5-f05300c19d99-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.750225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2" event={"ID":"d2a7b19f-a0a4-4aa8-80c5-f05300c19d99","Type":"ContainerDied","Data":"620656884f7a96380e02f683d8e29832458353ef22d3af316fceef3ac515ebe3"}
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.750812 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="620656884f7a96380e02f683d8e29832458353ef22d3af316fceef3ac515ebe3"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.750262 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-6lnt2"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.968432 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"]
Feb 27 01:46:25 crc kubenswrapper[4771]: E0227 01:46:25.969082 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a" containerName="oc"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.969103 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a" containerName="oc"
Feb 27 01:46:25 crc kubenswrapper[4771]: E0227 01:46:25.969128 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.969137 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.969361 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a7b19f-a0a4-4aa8-80c5-f05300c19d99" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.969403 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a" containerName="oc"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.970078 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.973001 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.973229 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.973269 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.973459 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kwjcj"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.975500 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 27 01:46:25 crc kubenswrapper[4771]: I0227 01:46:25.979665 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"]
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.057792 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.057877 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f7hp\" (UniqueName: \"kubernetes.io/projected/dc880077-8590-47a1-a434-e8cebcf3fff1-kube-api-access-2f7hp\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.057931 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.057961 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.058007 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.058026 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.058050 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.159681 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.159781 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f7hp\" (UniqueName: \"kubernetes.io/projected/dc880077-8590-47a1-a434-e8cebcf3fff1-kube-api-access-2f7hp\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.159853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.159888 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.160709 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.160762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.160795 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.165089 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.165612 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.166621 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.167260 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.168848 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.172037 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.183756 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f7hp\" (UniqueName: \"kubernetes.io/projected/dc880077-8590-47a1-a434-e8cebcf3fff1-kube-api-access-2f7hp\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hlghk\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.288290 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:46:26 crc kubenswrapper[4771]: I0227 01:46:26.856578 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"]
Feb 27 01:46:27 crc kubenswrapper[4771]: I0227 01:46:27.769957 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk" event={"ID":"dc880077-8590-47a1-a434-e8cebcf3fff1","Type":"ContainerStarted","Data":"d1abbfbf45fbcd592f23bebd59a7f559cc9dc025fc5cd29adf35affc4c8f3670"}
Feb 27 01:46:27 crc kubenswrapper[4771]: I0227 01:46:27.770573 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk" event={"ID":"dc880077-8590-47a1-a434-e8cebcf3fff1","Type":"ContainerStarted","Data":"70df9b3a8d5358c70fe405eaff23dceddb3c70e4e493a5f3ab7dc9e435100af5"}
Feb 27 01:46:27 crc kubenswrapper[4771]: I0227 01:46:27.795159 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk" podStartSLOduration=2.293039604 podStartE2EDuration="2.795142012s" podCreationTimestamp="2026-02-27 01:46:25 +0000 UTC" firstStartedPulling="2026-02-27 01:46:26.861887195 +0000 UTC m=+2499.799448483" lastFinishedPulling="2026-02-27 01:46:27.363989603 +0000 UTC m=+2500.301550891" observedRunningTime="2026-02-27 01:46:27.793874547 +0000 UTC m=+2500.731435835" watchObservedRunningTime="2026-02-27 01:46:27.795142012 +0000 UTC m=+2500.732703300"
Feb 27 01:46:28 crc kubenswrapper[4771]: I0227 01:46:28.953720 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:46:28 crc kubenswrapper[4771]: I0227 01:46:28.954042 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:46:28 crc kubenswrapper[4771]: I0227 01:46:28.954083 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn"
Feb 27 01:46:28 crc kubenswrapper[4771]: I0227 01:46:28.954750 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 01:46:28 crc kubenswrapper[4771]: I0227 01:46:28.954804 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" gracePeriod=600
Feb 27 01:46:29 crc kubenswrapper[4771]: E0227 01:46:29.113763 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:46:29 crc kubenswrapper[4771]: I0227 01:46:29.647821 4771 scope.go:117] "RemoveContainer" containerID="825dec4b2239079b87f038b9becca27e61ea99a634550930605fb7928de59bea"
Feb 27 01:46:29 crc kubenswrapper[4771]: I0227 01:46:29.811942 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" exitCode=0
Feb 27 01:46:29 crc kubenswrapper[4771]: I0227 01:46:29.812006 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"}
Feb 27 01:46:29 crc kubenswrapper[4771]: I0227 01:46:29.812048 4771 scope.go:117] "RemoveContainer" containerID="2fa58e5c69c6875961ade4a259d074511e93c8575b54e900bfb9f5dcc26be68a"
Feb 27 01:46:29 crc kubenswrapper[4771]: I0227 01:46:29.812884 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:46:29 crc kubenswrapper[4771]: E0227 01:46:29.813194 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:46:41 crc kubenswrapper[4771]: I0227 01:46:41.773296 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:46:41 crc kubenswrapper[4771]: E0227 01:46:41.774115 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:46:55 crc kubenswrapper[4771]: I0227 01:46:55.774466 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:46:55 crc kubenswrapper[4771]: E0227 01:46:55.775244 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:47:09 crc kubenswrapper[4771]: I0227 01:47:09.773491 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:47:09 crc kubenswrapper[4771]: E0227 01:47:09.774361 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:47:22 crc kubenswrapper[4771]: I0227 01:47:22.772964 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:47:22 crc kubenswrapper[4771]: E0227 01:47:22.773867 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:47:37 crc kubenswrapper[4771]: I0227 01:47:37.779361 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:47:37 crc kubenswrapper[4771]: E0227 01:47:37.780132 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:47:48 crc kubenswrapper[4771]: I0227 01:47:48.774184 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:47:48 crc kubenswrapper[4771]: E0227 01:47:48.775465 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:47:59 crc kubenswrapper[4771]: I0227 01:47:59.774137 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:47:59 crc kubenswrapper[4771]: E0227 01:47:59.775223 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.162262 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535948-dcslg"]
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.164585 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535948-dcslg"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.167819 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.168512 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.172502 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.183115 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535948-dcslg"]
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.285122 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pcbz\" (UniqueName: \"kubernetes.io/projected/feb3590d-ec9d-4437-82c7-24e0f748d130-kube-api-access-4pcbz\") pod \"auto-csr-approver-29535948-dcslg\" (UID: \"feb3590d-ec9d-4437-82c7-24e0f748d130\") " pod="openshift-infra/auto-csr-approver-29535948-dcslg"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.386684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pcbz\" (UniqueName: \"kubernetes.io/projected/feb3590d-ec9d-4437-82c7-24e0f748d130-kube-api-access-4pcbz\") pod \"auto-csr-approver-29535948-dcslg\" (UID: \"feb3590d-ec9d-4437-82c7-24e0f748d130\") " pod="openshift-infra/auto-csr-approver-29535948-dcslg"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.408946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pcbz\" (UniqueName: \"kubernetes.io/projected/feb3590d-ec9d-4437-82c7-24e0f748d130-kube-api-access-4pcbz\") pod \"auto-csr-approver-29535948-dcslg\" (UID: \"feb3590d-ec9d-4437-82c7-24e0f748d130\") " pod="openshift-infra/auto-csr-approver-29535948-dcslg"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.497599 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535948-dcslg"
Feb 27 01:48:00 crc kubenswrapper[4771]: I0227 01:48:00.954883 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535948-dcslg"]
Feb 27 01:48:01 crc kubenswrapper[4771]: I0227 01:48:01.871031 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535948-dcslg" event={"ID":"feb3590d-ec9d-4437-82c7-24e0f748d130","Type":"ContainerStarted","Data":"1595744d6ed97cc5906971291e840d1469ee82fe8529380d2667964be0286679"}
Feb 27 01:48:02 crc kubenswrapper[4771]: I0227 01:48:02.880351 4771 generic.go:334] "Generic (PLEG): container finished" podID="feb3590d-ec9d-4437-82c7-24e0f748d130" containerID="0dfdbf95667659f25c1a91c3038129c9723547af061466b8cb8a1a926f8019f3" exitCode=0
Feb 27 01:48:02 crc kubenswrapper[4771]: I0227 01:48:02.881502 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535948-dcslg" event={"ID":"feb3590d-ec9d-4437-82c7-24e0f748d130","Type":"ContainerDied","Data":"0dfdbf95667659f25c1a91c3038129c9723547af061466b8cb8a1a926f8019f3"}
Feb 27 01:48:04 crc kubenswrapper[4771]: I0227 01:48:04.253620 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535948-dcslg"
Feb 27 01:48:04 crc kubenswrapper[4771]: I0227 01:48:04.269263 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pcbz\" (UniqueName: \"kubernetes.io/projected/feb3590d-ec9d-4437-82c7-24e0f748d130-kube-api-access-4pcbz\") pod \"feb3590d-ec9d-4437-82c7-24e0f748d130\" (UID: \"feb3590d-ec9d-4437-82c7-24e0f748d130\") "
Feb 27 01:48:04 crc kubenswrapper[4771]: I0227 01:48:04.275434 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb3590d-ec9d-4437-82c7-24e0f748d130-kube-api-access-4pcbz" (OuterVolumeSpecName: "kube-api-access-4pcbz") pod "feb3590d-ec9d-4437-82c7-24e0f748d130" (UID: "feb3590d-ec9d-4437-82c7-24e0f748d130"). InnerVolumeSpecName "kube-api-access-4pcbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:48:04 crc kubenswrapper[4771]: I0227 01:48:04.371986 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pcbz\" (UniqueName: \"kubernetes.io/projected/feb3590d-ec9d-4437-82c7-24e0f748d130-kube-api-access-4pcbz\") on node \"crc\" DevicePath \"\""
Feb 27 01:48:04 crc kubenswrapper[4771]: I0227 01:48:04.906204 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535948-dcslg" event={"ID":"feb3590d-ec9d-4437-82c7-24e0f748d130","Type":"ContainerDied","Data":"1595744d6ed97cc5906971291e840d1469ee82fe8529380d2667964be0286679"}
Feb 27 01:48:04 crc kubenswrapper[4771]: I0227 01:48:04.906245 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1595744d6ed97cc5906971291e840d1469ee82fe8529380d2667964be0286679"
Feb 27 01:48:04 crc kubenswrapper[4771]: I0227 01:48:04.906292 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535948-dcslg"
Feb 27 01:48:05 crc kubenswrapper[4771]: I0227 01:48:05.344356 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535942-hdqwj"]
Feb 27 01:48:05 crc kubenswrapper[4771]: I0227 01:48:05.351471 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535942-hdqwj"]
Feb 27 01:48:05 crc kubenswrapper[4771]: I0227 01:48:05.787011 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d72e49-e603-4837-88de-df823ea62b8e" path="/var/lib/kubelet/pods/04d72e49-e603-4837-88de-df823ea62b8e/volumes"
Feb 27 01:48:12 crc kubenswrapper[4771]: I0227 01:48:12.773229 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:48:12 crc kubenswrapper[4771]: E0227 01:48:12.774164 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:48:26 crc kubenswrapper[4771]: I0227 01:48:26.773202 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:48:26 crc kubenswrapper[4771]: E0227 01:48:26.773929 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:48:29 crc kubenswrapper[4771]: I0227 01:48:29.764456 4771 scope.go:117] "RemoveContainer" containerID="28af7a170fd2d45d61f4c3a7255ca8ead24b6ebecc75e0624862258f6030bcc4"
Feb 27 01:48:39 crc kubenswrapper[4771]: I0227 01:48:39.774329 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:48:39 crc kubenswrapper[4771]: E0227 01:48:39.776817 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:48:53 crc kubenswrapper[4771]: I0227 01:48:53.774003 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:48:53 crc kubenswrapper[4771]: E0227 01:48:53.775010 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:49:00 crc kubenswrapper[4771]: I0227 01:49:00.480324 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc880077-8590-47a1-a434-e8cebcf3fff1" containerID="d1abbfbf45fbcd592f23bebd59a7f559cc9dc025fc5cd29adf35affc4c8f3670" exitCode=0
Feb 27 01:49:00 crc kubenswrapper[4771]: I0227 01:49:00.482403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk" event={"ID":"dc880077-8590-47a1-a434-e8cebcf3fff1","Type":"ContainerDied","Data":"d1abbfbf45fbcd592f23bebd59a7f559cc9dc025fc5cd29adf35affc4c8f3670"}
Feb 27 01:49:01 crc kubenswrapper[4771]: I0227 01:49:01.897131 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.050628 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-2\") pod \"dc880077-8590-47a1-a434-e8cebcf3fff1\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") "
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.050942 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ssh-key-openstack-edpm-ipam\") pod \"dc880077-8590-47a1-a434-e8cebcf3fff1\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") "
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.050971 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-1\") pod \"dc880077-8590-47a1-a434-e8cebcf3fff1\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") "
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.051006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-telemetry-combined-ca-bundle\") pod \"dc880077-8590-47a1-a434-e8cebcf3fff1\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") "
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.051055 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f7hp\" (UniqueName: \"kubernetes.io/projected/dc880077-8590-47a1-a434-e8cebcf3fff1-kube-api-access-2f7hp\") pod \"dc880077-8590-47a1-a434-e8cebcf3fff1\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") "
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.051255 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-0\") pod \"dc880077-8590-47a1-a434-e8cebcf3fff1\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") "
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.051449 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-inventory\") pod \"dc880077-8590-47a1-a434-e8cebcf3fff1\" (UID: \"dc880077-8590-47a1-a434-e8cebcf3fff1\") "
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.056401 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc880077-8590-47a1-a434-e8cebcf3fff1-kube-api-access-2f7hp" (OuterVolumeSpecName: "kube-api-access-2f7hp") pod "dc880077-8590-47a1-a434-e8cebcf3fff1" (UID: "dc880077-8590-47a1-a434-e8cebcf3fff1"). InnerVolumeSpecName "kube-api-access-2f7hp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.058646 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "dc880077-8590-47a1-a434-e8cebcf3fff1" (UID: "dc880077-8590-47a1-a434-e8cebcf3fff1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.078130 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "dc880077-8590-47a1-a434-e8cebcf3fff1" (UID: "dc880077-8590-47a1-a434-e8cebcf3fff1"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.078201 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-inventory" (OuterVolumeSpecName: "inventory") pod "dc880077-8590-47a1-a434-e8cebcf3fff1" (UID: "dc880077-8590-47a1-a434-e8cebcf3fff1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.080298 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "dc880077-8590-47a1-a434-e8cebcf3fff1" (UID: "dc880077-8590-47a1-a434-e8cebcf3fff1"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.088378 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "dc880077-8590-47a1-a434-e8cebcf3fff1" (UID: "dc880077-8590-47a1-a434-e8cebcf3fff1"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.106657 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dc880077-8590-47a1-a434-e8cebcf3fff1" (UID: "dc880077-8590-47a1-a434-e8cebcf3fff1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.153822 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-inventory\") on node \"crc\" DevicePath \"\""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.153877 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.153896 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.153912 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.153924 4771 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.153936 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f7hp\" (UniqueName: \"kubernetes.io/projected/dc880077-8590-47a1-a434-e8cebcf3fff1-kube-api-access-2f7hp\") on node \"crc\" DevicePath \"\""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.153947 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dc880077-8590-47a1-a434-e8cebcf3fff1-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.507248 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk" event={"ID":"dc880077-8590-47a1-a434-e8cebcf3fff1","Type":"ContainerDied","Data":"70df9b3a8d5358c70fe405eaff23dceddb3c70e4e493a5f3ab7dc9e435100af5"}
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.507285 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70df9b3a8d5358c70fe405eaff23dceddb3c70e4e493a5f3ab7dc9e435100af5"
Feb 27 01:49:02 crc kubenswrapper[4771]: I0227 01:49:02.507290 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hlghk"
Feb 27 01:49:05 crc kubenswrapper[4771]: I0227 01:49:05.774304 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:49:05 crc kubenswrapper[4771]: E0227 01:49:05.775063 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:49:19 crc kubenswrapper[4771]: I0227 01:49:19.773059 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:49:19 crc kubenswrapper[4771]: E0227 01:49:19.774402 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:49:32 crc kubenswrapper[4771]: I0227 01:49:32.774008 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:49:32 crc kubenswrapper[4771]: E0227 01:49:32.774732 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:49:45 crc kubenswrapper[4771]: I0227 01:49:45.773038 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:49:45 crc kubenswrapper[4771]: E0227 01:49:45.773999 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.025235 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nbhjn"]
Feb 27 01:49:53 crc kubenswrapper[4771]: E0227 01:49:53.027237 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb3590d-ec9d-4437-82c7-24e0f748d130" containerName="oc"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.027274 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb3590d-ec9d-4437-82c7-24e0f748d130" containerName="oc"
Feb 27 01:49:53 crc kubenswrapper[4771]: E0227 01:49:53.027325 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc880077-8590-47a1-a434-e8cebcf3fff1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.027344 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc880077-8590-47a1-a434-e8cebcf3fff1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.027851 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc880077-8590-47a1-a434-e8cebcf3fff1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.027916 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb3590d-ec9d-4437-82c7-24e0f748d130" containerName="oc"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.030339 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.041922 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbhjn"]
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.203536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-utilities\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.203997 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-catalog-content\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.204034 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fqmn\" (UniqueName: \"kubernetes.io/projected/51299394-f03e-4d80-8271-9c725f0feed9-kube-api-access-8fqmn\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.305902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-catalog-content\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.305949 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fqmn\" (UniqueName: \"kubernetes.io/projected/51299394-f03e-4d80-8271-9c725f0feed9-kube-api-access-8fqmn\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.306382 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-catalog-content\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.306513 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-utilities\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.306884 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-utilities\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.327794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fqmn\" (UniqueName: \"kubernetes.io/projected/51299394-f03e-4d80-8271-9c725f0feed9-kube-api-access-8fqmn\") pod \"community-operators-nbhjn\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.380504 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbhjn"
Feb 27 01:49:53 crc kubenswrapper[4771]: I0227 01:49:53.881842 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbhjn"]
Feb 27 01:49:54 crc kubenswrapper[4771]: I0227 01:49:54.037891 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhjn" event={"ID":"51299394-f03e-4d80-8271-9c725f0feed9","Type":"ContainerStarted","Data":"831190ab21b3120a1c4cfe95bbbf8d685700e5b7a119e131bd800ea9fadd325f"}
Feb 27 01:49:55 crc kubenswrapper[4771]: I0227 01:49:55.053812 4771 generic.go:334] "Generic (PLEG): container finished" podID="51299394-f03e-4d80-8271-9c725f0feed9" containerID="b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa" exitCode=0
Feb 27 01:49:55 crc kubenswrapper[4771]: I0227 01:49:55.053917 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhjn" event={"ID":"51299394-f03e-4d80-8271-9c725f0feed9","Type":"ContainerDied","Data":"b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa"}
Feb 27 01:49:56 crc kubenswrapper[4771]: I0227 01:49:56.064975 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhjn" event={"ID":"51299394-f03e-4d80-8271-9c725f0feed9","Type":"ContainerStarted","Data":"899001edff904031747e9db195078da25f6580901392492667a294af331c7e06"}
Feb 27 01:49:56 crc kubenswrapper[4771]: I0227 01:49:56.773697 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:49:56 crc kubenswrapper[4771]: E0227 01:49:56.774257 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 01:49:57 crc kubenswrapper[4771]: I0227 01:49:57.074918 4771 generic.go:334] "Generic (PLEG): container finished" podID="51299394-f03e-4d80-8271-9c725f0feed9" containerID="899001edff904031747e9db195078da25f6580901392492667a294af331c7e06" exitCode=0
Feb 27 01:49:57 crc kubenswrapper[4771]: I0227 01:49:57.074967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhjn" event={"ID":"51299394-f03e-4d80-8271-9c725f0feed9","Type":"ContainerDied","Data":"899001edff904031747e9db195078da25f6580901392492667a294af331c7e06"}
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.087474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhjn" event={"ID":"51299394-f03e-4d80-8271-9c725f0feed9","Type":"ContainerStarted","Data":"88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8"}
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.138126 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nbhjn" podStartSLOduration=3.69532694 podStartE2EDuration="6.138102082s" podCreationTimestamp="2026-02-27 01:49:52 +0000 UTC" firstStartedPulling="2026-02-27 01:49:55.058401257 +0000 UTC m=+2707.995962575" lastFinishedPulling="2026-02-27 01:49:57.501176429 +0000 UTC m=+2710.438737717" observedRunningTime="2026-02-27 01:49:58.130989037 +0000 UTC m=+2711.068550375" watchObservedRunningTime="2026-02-27 01:49:58.138102082 +0000 UTC m=+2711.075663380"
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.906894 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.908778 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.913390 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.913989 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.914044 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.914327 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-q58qr"
Feb 27 01:49:58 crc kubenswrapper[4771]: I0227 01:49:58.934166 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084337 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084570 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084619 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzlmx\" (UniqueName: \"kubernetes.io/projected/4b362ce5-5892-43a0-8ec9-e280131b32ee-kube-api-access-mzlmx\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084685 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-config-data\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084706 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084759 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.084787 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.187405 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.187615 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest"
Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.187809 4771 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188307 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188544 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188667 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzlmx\" (UniqueName: \"kubernetes.io/projected/4b362ce5-5892-43a0-8ec9-e280131b32ee-kube-api-access-mzlmx\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188752 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-config-data\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188795 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188823 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.188877 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.189296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.189681 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-config-data\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.195990 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.196247 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.198358 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.208415 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzlmx\" (UniqueName: \"kubernetes.io/projected/4b362ce5-5892-43a0-8ec9-e280131b32ee-kube-api-access-mzlmx\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.244846 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " pod="openstack/tempest-tests-tempest" Feb 27 01:49:59 crc kubenswrapper[4771]: I0227 01:49:59.536043 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.006858 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.112629 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4b362ce5-5892-43a0-8ec9-e280131b32ee","Type":"ContainerStarted","Data":"3dea908ac4d6f63aee8c97dc34c8eaa1b6d1c11b0ebce05954f743fe66bad5c4"} Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.137503 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535950-flbxw"] Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.138978 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535950-flbxw" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.140944 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.141210 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.142011 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.153093 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535950-flbxw"] Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.314831 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkqh4\" (UniqueName: \"kubernetes.io/projected/1de3ac97-64a7-45a4-9363-dff9ee8d5c9f-kube-api-access-wkqh4\") pod \"auto-csr-approver-29535950-flbxw\" (UID: \"1de3ac97-64a7-45a4-9363-dff9ee8d5c9f\") " pod="openshift-infra/auto-csr-approver-29535950-flbxw" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.417010 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkqh4\" (UniqueName: \"kubernetes.io/projected/1de3ac97-64a7-45a4-9363-dff9ee8d5c9f-kube-api-access-wkqh4\") pod \"auto-csr-approver-29535950-flbxw\" (UID: \"1de3ac97-64a7-45a4-9363-dff9ee8d5c9f\") " pod="openshift-infra/auto-csr-approver-29535950-flbxw" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.448703 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkqh4\" (UniqueName: \"kubernetes.io/projected/1de3ac97-64a7-45a4-9363-dff9ee8d5c9f-kube-api-access-wkqh4\") pod \"auto-csr-approver-29535950-flbxw\" (UID: \"1de3ac97-64a7-45a4-9363-dff9ee8d5c9f\") " pod="openshift-infra/auto-csr-approver-29535950-flbxw" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.467363 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535950-flbxw" Feb 27 01:50:00 crc kubenswrapper[4771]: I0227 01:50:00.995681 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535950-flbxw"] Feb 27 01:50:01 crc kubenswrapper[4771]: I0227 01:50:01.124964 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535950-flbxw" event={"ID":"1de3ac97-64a7-45a4-9363-dff9ee8d5c9f","Type":"ContainerStarted","Data":"aca7255e40dcfdc0f5d312bf32b46d288ffca51bb37f9498ce8bdb4437b8253d"} Feb 27 01:50:03 crc kubenswrapper[4771]: I0227 01:50:03.150797 4771 generic.go:334] "Generic (PLEG): container finished" podID="1de3ac97-64a7-45a4-9363-dff9ee8d5c9f" containerID="65991994110bdd7e030a864b76e72e95c2a27dc16d3b0305d959c2e847b7f0dc" exitCode=0 Feb 27 01:50:03 crc kubenswrapper[4771]: I0227 01:50:03.151248 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535950-flbxw" event={"ID":"1de3ac97-64a7-45a4-9363-dff9ee8d5c9f","Type":"ContainerDied","Data":"65991994110bdd7e030a864b76e72e95c2a27dc16d3b0305d959c2e847b7f0dc"} Feb 27 01:50:03 crc kubenswrapper[4771]: I0227 01:50:03.381139 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nbhjn" Feb 27 01:50:03 crc kubenswrapper[4771]: I0227 01:50:03.381220 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nbhjn" Feb 27 01:50:03 crc kubenswrapper[4771]: I0227 01:50:03.434524 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nbhjn" Feb 27 01:50:04 crc kubenswrapper[4771]: I0227 01:50:04.210492 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nbhjn" Feb 27 01:50:04 crc kubenswrapper[4771]: I0227 01:50:04.266590 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbhjn"] Feb 27 01:50:06 crc kubenswrapper[4771]: I0227 01:50:06.183461 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nbhjn" podUID="51299394-f03e-4d80-8271-9c725f0feed9" containerName="registry-server" containerID="cri-o://88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8" gracePeriod=2 Feb 27 01:50:06 crc kubenswrapper[4771]: I0227 01:50:06.813097 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535950-flbxw" Feb 27 01:50:06 crc kubenswrapper[4771]: I0227 01:50:06.960027 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkqh4\" (UniqueName: \"kubernetes.io/projected/1de3ac97-64a7-45a4-9363-dff9ee8d5c9f-kube-api-access-wkqh4\") pod \"1de3ac97-64a7-45a4-9363-dff9ee8d5c9f\" (UID: \"1de3ac97-64a7-45a4-9363-dff9ee8d5c9f\") " Feb 27 01:50:06 crc kubenswrapper[4771]: I0227 01:50:06.967520 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de3ac97-64a7-45a4-9363-dff9ee8d5c9f-kube-api-access-wkqh4" (OuterVolumeSpecName: "kube-api-access-wkqh4") pod "1de3ac97-64a7-45a4-9363-dff9ee8d5c9f" (UID: "1de3ac97-64a7-45a4-9363-dff9ee8d5c9f"). InnerVolumeSpecName "kube-api-access-wkqh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.063401 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkqh4\" (UniqueName: \"kubernetes.io/projected/1de3ac97-64a7-45a4-9363-dff9ee8d5c9f-kube-api-access-wkqh4\") on node \"crc\" DevicePath \"\"" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.097183 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbhjn" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.164451 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-utilities\") pod \"51299394-f03e-4d80-8271-9c725f0feed9\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.164770 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-catalog-content\") pod \"51299394-f03e-4d80-8271-9c725f0feed9\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.164803 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fqmn\" (UniqueName: \"kubernetes.io/projected/51299394-f03e-4d80-8271-9c725f0feed9-kube-api-access-8fqmn\") pod \"51299394-f03e-4d80-8271-9c725f0feed9\" (UID: \"51299394-f03e-4d80-8271-9c725f0feed9\") " Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.165479 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-utilities" (OuterVolumeSpecName: "utilities") pod "51299394-f03e-4d80-8271-9c725f0feed9" (UID: "51299394-f03e-4d80-8271-9c725f0feed9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.179191 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51299394-f03e-4d80-8271-9c725f0feed9-kube-api-access-8fqmn" (OuterVolumeSpecName: "kube-api-access-8fqmn") pod "51299394-f03e-4d80-8271-9c725f0feed9" (UID: "51299394-f03e-4d80-8271-9c725f0feed9"). InnerVolumeSpecName "kube-api-access-8fqmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.195864 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535950-flbxw" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.195867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535950-flbxw" event={"ID":"1de3ac97-64a7-45a4-9363-dff9ee8d5c9f","Type":"ContainerDied","Data":"aca7255e40dcfdc0f5d312bf32b46d288ffca51bb37f9498ce8bdb4437b8253d"} Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.196028 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aca7255e40dcfdc0f5d312bf32b46d288ffca51bb37f9498ce8bdb4437b8253d" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.209774 4771 generic.go:334] "Generic (PLEG): container finished" podID="51299394-f03e-4d80-8271-9c725f0feed9" containerID="88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8" exitCode=0 Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.209828 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhjn" event={"ID":"51299394-f03e-4d80-8271-9c725f0feed9","Type":"ContainerDied","Data":"88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8"} Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.209865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhjn" event={"ID":"51299394-f03e-4d80-8271-9c725f0feed9","Type":"ContainerDied","Data":"831190ab21b3120a1c4cfe95bbbf8d685700e5b7a119e131bd800ea9fadd325f"} Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.209895 4771 scope.go:117] "RemoveContainer" containerID="88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.210097 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbhjn" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.217510 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51299394-f03e-4d80-8271-9c725f0feed9" (UID: "51299394-f03e-4d80-8271-9c725f0feed9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.255105 4771 scope.go:117] "RemoveContainer" containerID="899001edff904031747e9db195078da25f6580901392492667a294af331c7e06" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.266895 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.266940 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fqmn\" (UniqueName: \"kubernetes.io/projected/51299394-f03e-4d80-8271-9c725f0feed9-kube-api-access-8fqmn\") on node \"crc\" DevicePath \"\"" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.266957 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51299394-f03e-4d80-8271-9c725f0feed9-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.300840 4771 scope.go:117] "RemoveContainer" containerID="b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.320627 4771 scope.go:117] "RemoveContainer" containerID="88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8" Feb 27 01:50:07 crc kubenswrapper[4771]: E0227 01:50:07.321070 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8\": container with ID starting with 88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8 not found: ID does not exist" containerID="88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.321101 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8"} err="failed to get container status \"88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8\": rpc error: code = NotFound desc = could not find container \"88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8\": container with ID starting with 88a1df5e1591d74faa216dd872ab323e48f6f4eb45e39f78a58da5aec8c990e8 not found: ID does not exist" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.321121 4771 scope.go:117] "RemoveContainer" containerID="899001edff904031747e9db195078da25f6580901392492667a294af331c7e06" Feb 27 01:50:07 crc kubenswrapper[4771]: E0227 01:50:07.321676 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"899001edff904031747e9db195078da25f6580901392492667a294af331c7e06\": container with ID starting with 899001edff904031747e9db195078da25f6580901392492667a294af331c7e06 not found: ID does not exist" containerID="899001edff904031747e9db195078da25f6580901392492667a294af331c7e06" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.321701 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"899001edff904031747e9db195078da25f6580901392492667a294af331c7e06"} err="failed to get container status \"899001edff904031747e9db195078da25f6580901392492667a294af331c7e06\": rpc error: code = NotFound desc = could not find container 
\"899001edff904031747e9db195078da25f6580901392492667a294af331c7e06\": container with ID starting with 899001edff904031747e9db195078da25f6580901392492667a294af331c7e06 not found: ID does not exist" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.321717 4771 scope.go:117] "RemoveContainer" containerID="b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa" Feb 27 01:50:07 crc kubenswrapper[4771]: E0227 01:50:07.321991 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa\": container with ID starting with b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa not found: ID does not exist" containerID="b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.322014 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa"} err="failed to get container status \"b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa\": rpc error: code = NotFound desc = could not find container \"b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa\": container with ID starting with b2e711043542811726134b425518bfd9b61b9dd9b6a282825740fbaf1c3631aa not found: ID does not exist" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.562981 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbhjn"] Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.571149 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nbhjn"] Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.794620 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51299394-f03e-4d80-8271-9c725f0feed9" path="/var/lib/kubelet/pods/51299394-f03e-4d80-8271-9c725f0feed9/volumes" Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.888331 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535944-dvp6w"] Feb 27 01:50:07 crc kubenswrapper[4771]: I0227 01:50:07.895838 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535944-dvp6w"] Feb 27 01:50:09 crc kubenswrapper[4771]: I0227 01:50:09.790036 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef05ae1-b240-4abf-a4d4-0605ec393956" path="/var/lib/kubelet/pods/8ef05ae1-b240-4abf-a4d4-0605ec393956/volumes" Feb 27 01:50:11 crc kubenswrapper[4771]: I0227 01:50:11.773300 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" Feb 27 01:50:11 crc kubenswrapper[4771]: E0227 01:50:11.773722 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:50:22 crc kubenswrapper[4771]: I0227 01:50:22.773049 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" Feb 27 01:50:22 crc kubenswrapper[4771]: E0227 01:50:22.773899 4771 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:50:28 crc kubenswrapper[4771]: E0227 01:50:28.637578 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 27 01:50:28 crc kubenswrapper[4771]: E0227 01:50:28.638287 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzlmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvS
ource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(4b362ce5-5892-43a0-8ec9-e280131b32ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 27 01:50:28 crc kubenswrapper[4771]: E0227 01:50:28.639445 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="4b362ce5-5892-43a0-8ec9-e280131b32ee" Feb 27 01:50:29 crc kubenswrapper[4771]: E0227 01:50:29.420346 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="4b362ce5-5892-43a0-8ec9-e280131b32ee" Feb 27 01:50:29 crc kubenswrapper[4771]: I0227 01:50:29.880860 4771 scope.go:117] "RemoveContainer" containerID="ed4cfb69ce9a6c93c545ae6a84ee2299a88174de2d369eb8caa33840cbc1047f" Feb 27 01:50:34 crc kubenswrapper[4771]: I0227 01:50:34.773557 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" Feb 27 01:50:34 crc kubenswrapper[4771]: E0227 01:50:34.775315 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:50:44 crc kubenswrapper[4771]: I0227 01:50:44.574446 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4b362ce5-5892-43a0-8ec9-e280131b32ee","Type":"ContainerStarted","Data":"12d228d3dc7206d21966d9d081e359bb92788e3bc231ff9e7f545fb2af557236"} Feb 27 01:50:44 crc kubenswrapper[4771]: I0227 01:50:44.602661 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.420220542 podStartE2EDuration="47.602645442s" podCreationTimestamp="2026-02-27 01:49:57 +0000 UTC" firstStartedPulling="2026-02-27 01:50:00.008055283 +0000 UTC m=+2712.945616571" lastFinishedPulling="2026-02-27 01:50:43.190480143 +0000 UTC m=+2756.128041471" observedRunningTime="2026-02-27 01:50:44.598737834 +0000 UTC m=+2757.536299132" watchObservedRunningTime="2026-02-27 01:50:44.602645442 +0000 UTC m=+2757.540206730" Feb 27 01:50:47 crc kubenswrapper[4771]: I0227 01:50:47.784901 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" Feb 27 01:50:47 crc kubenswrapper[4771]: E0227 01:50:47.786402 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:50:58 crc kubenswrapper[4771]: I0227 01:50:58.774051 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" Feb 27 01:50:58 crc kubenswrapper[4771]: E0227 01:50:58.774621 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:51:09 crc kubenswrapper[4771]: I0227 01:51:09.774167 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" Feb 27 01:51:09 crc kubenswrapper[4771]: E0227 01:51:09.775156 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:51:21 crc kubenswrapper[4771]: I0227 01:51:21.774496 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" Feb 27 01:51:21 crc kubenswrapper[4771]: E0227 01:51:21.776146 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.156243 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8gx45"] Feb 27 01:51:30 crc kubenswrapper[4771]: E0227 01:51:30.157421 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de3ac97-64a7-45a4-9363-dff9ee8d5c9f" containerName="oc" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.157434 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de3ac97-64a7-45a4-9363-dff9ee8d5c9f" containerName="oc" Feb 27 01:51:30 crc kubenswrapper[4771]: E0227 01:51:30.157449 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51299394-f03e-4d80-8271-9c725f0feed9" containerName="registry-server" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.157455 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51299394-f03e-4d80-8271-9c725f0feed9" containerName="registry-server" Feb 27 01:51:30 crc kubenswrapper[4771]: E0227 01:51:30.157467 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51299394-f03e-4d80-8271-9c725f0feed9" containerName="extract-utilities" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.157473 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51299394-f03e-4d80-8271-9c725f0feed9" 
containerName="extract-utilities" Feb 27 01:51:30 crc kubenswrapper[4771]: E0227 01:51:30.157482 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51299394-f03e-4d80-8271-9c725f0feed9" containerName="extract-content" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.157487 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="51299394-f03e-4d80-8271-9c725f0feed9" containerName="extract-content" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.157683 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="51299394-f03e-4d80-8271-9c725f0feed9" containerName="registry-server" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.157706 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de3ac97-64a7-45a4-9363-dff9ee8d5c9f" containerName="oc" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.159031 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.170404 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gx45"] Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.200496 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-utilities\") pod \"certified-operators-8gx45\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.200634 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-catalog-content\") pod \"certified-operators-8gx45\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.200927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtjs5\" (UniqueName: \"kubernetes.io/projected/eb326a80-d5dd-43b5-a208-a8f0d0ace903-kube-api-access-rtjs5\") pod \"certified-operators-8gx45\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.303023 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-utilities\") pod \"certified-operators-8gx45\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.303132 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-catalog-content\") pod \"certified-operators-8gx45\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.303234 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtjs5\" (UniqueName: \"kubernetes.io/projected/eb326a80-d5dd-43b5-a208-a8f0d0ace903-kube-api-access-rtjs5\") pod \"certified-operators-8gx45\" (UID: 
\"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.303485 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-utilities\") pod \"certified-operators-8gx45\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.303527 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-catalog-content\") pod \"certified-operators-8gx45\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.320643 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtjs5\" (UniqueName: \"kubernetes.io/projected/eb326a80-d5dd-43b5-a208-a8f0d0ace903-kube-api-access-rtjs5\") pod \"certified-operators-8gx45\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") " pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:30 crc kubenswrapper[4771]: I0227 01:51:30.495490 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gx45" Feb 27 01:51:31 crc kubenswrapper[4771]: I0227 01:51:31.102644 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gx45"] Feb 27 01:51:32 crc kubenswrapper[4771]: I0227 01:51:32.059442 4771 generic.go:334] "Generic (PLEG): container finished" podID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerID="e60bf77f7993df9aa5c59f7a30be10ccd5d451f846c5860a27d40fddbfe102a6" exitCode=0 Feb 27 01:51:32 crc kubenswrapper[4771]: I0227 01:51:32.059650 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gx45" event={"ID":"eb326a80-d5dd-43b5-a208-a8f0d0ace903","Type":"ContainerDied","Data":"e60bf77f7993df9aa5c59f7a30be10ccd5d451f846c5860a27d40fddbfe102a6"} Feb 27 01:51:32 crc kubenswrapper[4771]: I0227 01:51:32.059972 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gx45" event={"ID":"eb326a80-d5dd-43b5-a208-a8f0d0ace903","Type":"ContainerStarted","Data":"c87355cae7ec73986a0bc5b137e89ec1141d805a571af8c0894476eb03060a41"} Feb 27 01:51:32 crc kubenswrapper[4771]: I0227 01:51:32.062656 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:51:33 crc kubenswrapper[4771]: I0227 01:51:33.100238 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gx45" event={"ID":"eb326a80-d5dd-43b5-a208-a8f0d0ace903","Type":"ContainerStarted","Data":"9c8c4d94b5e68c0f01931b693d2c181ffaa2b265fe8bc062b15fe9748bf1adc8"} Feb 27 01:51:33 crc kubenswrapper[4771]: I0227 01:51:33.773723 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f" Feb 27 01:51:34 crc kubenswrapper[4771]: I0227 01:51:34.109747 4771 generic.go:334] "Generic (PLEG): container finished" podID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerID="9c8c4d94b5e68c0f01931b693d2c181ffaa2b265fe8bc062b15fe9748bf1adc8" exitCode=0 Feb 27 01:51:34 crc kubenswrapper[4771]: I0227 01:51:34.109855 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gx45" event={"ID":"eb326a80-d5dd-43b5-a208-a8f0d0ace903","Type":"ContainerDied","Data":"9c8c4d94b5e68c0f01931b693d2c181ffaa2b265fe8bc062b15fe9748bf1adc8"} Feb 27 01:51:34 crc kubenswrapper[4771]: I0227 01:51:34.115426 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"9bc7e891ab68febf267f0a069a80d85ee866b60dfced10cc635f46adae460c1b"} Feb 27 01:51:35 crc kubenswrapper[4771]: I0227 01:51:35.127852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gx45" event={"ID":"eb326a80-d5dd-43b5-a208-a8f0d0ace903","Type":"ContainerStarted","Data":"7c90de4b514e1eb7f35d961f8d63643cf8f0c5f4a0c562fce17f2a220d8607bc"} Feb 27 01:51:35 crc kubenswrapper[4771]: I0227 01:51:35.153895 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8gx45" podStartSLOduration=2.688654033 podStartE2EDuration="5.153872801s" podCreationTimestamp="2026-02-27 01:51:30 +0000 UTC" firstStartedPulling="2026-02-27 01:51:32.062202387 +0000 UTC m=+2804.999763685" lastFinishedPulling="2026-02-27 01:51:34.527421165 +0000 UTC m=+2807.464982453" observedRunningTime="2026-02-27 01:51:35.146676513 +0000 UTC m=+2808.084237811" watchObservedRunningTime="2026-02-27 01:51:35.153872801 +0000 UTC m=+2808.091434089" Feb 27 01:51:39 crc kubenswrapper[4771]: I0227 01:51:39.980750 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kxvj7"] Feb 27 01:51:39 crc kubenswrapper[4771]: I0227 01:51:39.991721 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:39 crc kubenswrapper[4771]: I0227 01:51:39.999543 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxvj7"]
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.130024 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-utilities\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.130116 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-catalog-content\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.130244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkm8\" (UniqueName: \"kubernetes.io/projected/1127bb8a-8b0b-41e9-84d9-3af25c157a94-kube-api-access-sxkm8\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.232283 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-catalog-content\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.232404 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxkm8\" (UniqueName: \"kubernetes.io/projected/1127bb8a-8b0b-41e9-84d9-3af25c157a94-kube-api-access-sxkm8\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.232496 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-utilities\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.232850 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-utilities\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.232912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-catalog-content\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.249687 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkm8\" (UniqueName: \"kubernetes.io/projected/1127bb8a-8b0b-41e9-84d9-3af25c157a94-kube-api-access-sxkm8\") pod \"redhat-operators-kxvj7\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") " pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.331057 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.496330 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8gx45"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.496861 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8gx45"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.562632 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8gx45"
Feb 27 01:51:40 crc kubenswrapper[4771]: I0227 01:51:40.840318 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxvj7"]
Feb 27 01:51:41 crc kubenswrapper[4771]: I0227 01:51:41.178415 4771 generic.go:334] "Generic (PLEG): container finished" podID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerID="7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd" exitCode=0
Feb 27 01:51:41 crc kubenswrapper[4771]: I0227 01:51:41.178487 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxvj7" event={"ID":"1127bb8a-8b0b-41e9-84d9-3af25c157a94","Type":"ContainerDied","Data":"7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd"}
Feb 27 01:51:41 crc kubenswrapper[4771]: I0227 01:51:41.178516 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxvj7" event={"ID":"1127bb8a-8b0b-41e9-84d9-3af25c157a94","Type":"ContainerStarted","Data":"a4b44cb16ecab1167a67cf419fd4375a7794a02c14093a34d60e04fe22527b77"}
Feb 27 01:51:41 crc kubenswrapper[4771]: I0227 01:51:41.229740 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8gx45"
Feb 27 01:51:42 crc kubenswrapper[4771]: I0227 01:51:42.956246 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8gx45"]
Feb 27 01:51:43 crc kubenswrapper[4771]: I0227 01:51:43.207833 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxvj7" event={"ID":"1127bb8a-8b0b-41e9-84d9-3af25c157a94","Type":"ContainerStarted","Data":"2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c"}
Feb 27 01:51:43 crc kubenswrapper[4771]: I0227 01:51:43.208007 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8gx45" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerName="registry-server" containerID="cri-o://7c90de4b514e1eb7f35d961f8d63643cf8f0c5f4a0c562fce17f2a220d8607bc" gracePeriod=2
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.231744 4771 generic.go:334] "Generic (PLEG): container finished" podID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerID="7c90de4b514e1eb7f35d961f8d63643cf8f0c5f4a0c562fce17f2a220d8607bc" exitCode=0
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.231912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gx45" event={"ID":"eb326a80-d5dd-43b5-a208-a8f0d0ace903","Type":"ContainerDied","Data":"7c90de4b514e1eb7f35d961f8d63643cf8f0c5f4a0c562fce17f2a220d8607bc"}
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.605145 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gx45"
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.659999 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-catalog-content\") pod \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") "
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.660225 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-utilities\") pod \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") "
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.662030 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-utilities" (OuterVolumeSpecName: "utilities") pod "eb326a80-d5dd-43b5-a208-a8f0d0ace903" (UID: "eb326a80-d5dd-43b5-a208-a8f0d0ace903"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.662151 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtjs5\" (UniqueName: \"kubernetes.io/projected/eb326a80-d5dd-43b5-a208-a8f0d0ace903-kube-api-access-rtjs5\") pod \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\" (UID: \"eb326a80-d5dd-43b5-a208-a8f0d0ace903\") "
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.663899 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.683564 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb326a80-d5dd-43b5-a208-a8f0d0ace903-kube-api-access-rtjs5" (OuterVolumeSpecName: "kube-api-access-rtjs5") pod "eb326a80-d5dd-43b5-a208-a8f0d0ace903" (UID: "eb326a80-d5dd-43b5-a208-a8f0d0ace903"). InnerVolumeSpecName "kube-api-access-rtjs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.710569 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb326a80-d5dd-43b5-a208-a8f0d0ace903" (UID: "eb326a80-d5dd-43b5-a208-a8f0d0ace903"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.765648 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb326a80-d5dd-43b5-a208-a8f0d0ace903-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:51:45 crc kubenswrapper[4771]: I0227 01:51:45.765705 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtjs5\" (UniqueName: \"kubernetes.io/projected/eb326a80-d5dd-43b5-a208-a8f0d0ace903-kube-api-access-rtjs5\") on node \"crc\" DevicePath \"\""
Feb 27 01:51:46 crc kubenswrapper[4771]: I0227 01:51:46.247350 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gx45" event={"ID":"eb326a80-d5dd-43b5-a208-a8f0d0ace903","Type":"ContainerDied","Data":"c87355cae7ec73986a0bc5b137e89ec1141d805a571af8c0894476eb03060a41"}
Feb 27 01:51:46 crc kubenswrapper[4771]: I0227 01:51:46.247437 4771 scope.go:117] "RemoveContainer" containerID="7c90de4b514e1eb7f35d961f8d63643cf8f0c5f4a0c562fce17f2a220d8607bc"
Feb 27 01:51:46 crc kubenswrapper[4771]: I0227 01:51:46.247467 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gx45"
Feb 27 01:51:46 crc kubenswrapper[4771]: I0227 01:51:46.290860 4771 scope.go:117] "RemoveContainer" containerID="9c8c4d94b5e68c0f01931b693d2c181ffaa2b265fe8bc062b15fe9748bf1adc8"
Feb 27 01:51:46 crc kubenswrapper[4771]: I0227 01:51:46.295353 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8gx45"]
Feb 27 01:51:46 crc kubenswrapper[4771]: I0227 01:51:46.311433 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8gx45"]
Feb 27 01:51:46 crc kubenswrapper[4771]: I0227 01:51:46.315326 4771 scope.go:117] "RemoveContainer" containerID="e60bf77f7993df9aa5c59f7a30be10ccd5d451f846c5860a27d40fddbfe102a6"
Feb 27 01:51:47 crc kubenswrapper[4771]: I0227 01:51:47.783931 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" path="/var/lib/kubelet/pods/eb326a80-d5dd-43b5-a208-a8f0d0ace903/volumes"
Feb 27 01:51:48 crc kubenswrapper[4771]: I0227 01:51:48.275273 4771 generic.go:334] "Generic (PLEG): container finished" podID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerID="2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c" exitCode=0
Feb 27 01:51:48 crc kubenswrapper[4771]: I0227 01:51:48.275363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxvj7" event={"ID":"1127bb8a-8b0b-41e9-84d9-3af25c157a94","Type":"ContainerDied","Data":"2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c"}
Feb 27 01:51:49 crc kubenswrapper[4771]: I0227 01:51:49.288784 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxvj7" event={"ID":"1127bb8a-8b0b-41e9-84d9-3af25c157a94","Type":"ContainerStarted","Data":"155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6"}
Feb 27 01:51:49 crc kubenswrapper[4771]: I0227 01:51:49.334431 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kxvj7" podStartSLOduration=2.832990526 podStartE2EDuration="10.334404608s" podCreationTimestamp="2026-02-27 01:51:39 +0000 UTC" firstStartedPulling="2026-02-27 01:51:41.180455604 +0000 UTC m=+2814.118016892" lastFinishedPulling="2026-02-27 01:51:48.681869646 +0000 UTC m=+2821.619430974" observedRunningTime="2026-02-27 01:51:49.32608741 +0000 UTC m=+2822.263648698" watchObservedRunningTime="2026-02-27 01:51:49.334404608 +0000 UTC m=+2822.271965916"
Feb 27 01:51:50 crc kubenswrapper[4771]: I0227 01:51:50.331591 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:50 crc kubenswrapper[4771]: I0227 01:51:50.331946 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:51:51 crc kubenswrapper[4771]: I0227 01:51:51.376344 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kxvj7" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="registry-server" probeResult="failure" output=<
Feb 27 01:51:51 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s
Feb 27 01:51:51 crc kubenswrapper[4771]: >
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.147464 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535952-vpk4t"]
Feb 27 01:52:00 crc kubenswrapper[4771]: E0227 01:52:00.149062 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerName="extract-utilities"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.149096 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerName="extract-utilities"
Feb 27 01:52:00 crc kubenswrapper[4771]: E0227 01:52:00.149120 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerName="extract-content"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.149127 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerName="extract-content"
Feb 27 01:52:00 crc kubenswrapper[4771]: E0227 01:52:00.149144 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerName="registry-server"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.149150 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerName="registry-server"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.149339 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb326a80-d5dd-43b5-a208-a8f0d0ace903" containerName="registry-server"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.150100 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535952-vpk4t"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.151926 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.152456 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.153131 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.161813 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535952-vpk4t"]
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.271918 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4ch\" (UniqueName: \"kubernetes.io/projected/1222f2a7-debd-4e47-bc0c-c16c371aaa9e-kube-api-access-2f4ch\") pod \"auto-csr-approver-29535952-vpk4t\" (UID: \"1222f2a7-debd-4e47-bc0c-c16c371aaa9e\") " pod="openshift-infra/auto-csr-approver-29535952-vpk4t"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.373215 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4ch\" (UniqueName: \"kubernetes.io/projected/1222f2a7-debd-4e47-bc0c-c16c371aaa9e-kube-api-access-2f4ch\") pod \"auto-csr-approver-29535952-vpk4t\" (UID: \"1222f2a7-debd-4e47-bc0c-c16c371aaa9e\") " pod="openshift-infra/auto-csr-approver-29535952-vpk4t"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.400117 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.411507 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4ch\" (UniqueName: \"kubernetes.io/projected/1222f2a7-debd-4e47-bc0c-c16c371aaa9e-kube-api-access-2f4ch\") pod \"auto-csr-approver-29535952-vpk4t\" (UID: \"1222f2a7-debd-4e47-bc0c-c16c371aaa9e\") " pod="openshift-infra/auto-csr-approver-29535952-vpk4t"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.456599 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.470122 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535952-vpk4t"
Feb 27 01:52:00 crc kubenswrapper[4771]: W0227 01:52:00.984535 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1222f2a7_debd_4e47_bc0c_c16c371aaa9e.slice/crio-8bb68cb03ea4698c9a9770278ad0e081099d43e3d18b21f8f86122cc9b574b44 WatchSource:0}: Error finding container 8bb68cb03ea4698c9a9770278ad0e081099d43e3d18b21f8f86122cc9b574b44: Status 404 returned error can't find the container with id 8bb68cb03ea4698c9a9770278ad0e081099d43e3d18b21f8f86122cc9b574b44
Feb 27 01:52:00 crc kubenswrapper[4771]: I0227 01:52:00.984986 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535952-vpk4t"]
Feb 27 01:52:01 crc kubenswrapper[4771]: I0227 01:52:01.365793 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxvj7"]
Feb 27 01:52:01 crc kubenswrapper[4771]: I0227 01:52:01.413606 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535952-vpk4t" event={"ID":"1222f2a7-debd-4e47-bc0c-c16c371aaa9e","Type":"ContainerStarted","Data":"8bb68cb03ea4698c9a9770278ad0e081099d43e3d18b21f8f86122cc9b574b44"}
Feb 27 01:52:02 crc kubenswrapper[4771]: I0227 01:52:02.421125 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kxvj7" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="registry-server" containerID="cri-o://155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6" gracePeriod=2
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.131094 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.233907 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-utilities\") pod \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") "
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.234533 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-catalog-content\") pod \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") "
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.235039 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxkm8\" (UniqueName: \"kubernetes.io/projected/1127bb8a-8b0b-41e9-84d9-3af25c157a94-kube-api-access-sxkm8\") pod \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\" (UID: \"1127bb8a-8b0b-41e9-84d9-3af25c157a94\") "
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.235061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-utilities" (OuterVolumeSpecName: "utilities") pod "1127bb8a-8b0b-41e9-84d9-3af25c157a94" (UID: "1127bb8a-8b0b-41e9-84d9-3af25c157a94"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.235862 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.242759 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1127bb8a-8b0b-41e9-84d9-3af25c157a94-kube-api-access-sxkm8" (OuterVolumeSpecName: "kube-api-access-sxkm8") pod "1127bb8a-8b0b-41e9-84d9-3af25c157a94" (UID: "1127bb8a-8b0b-41e9-84d9-3af25c157a94"). InnerVolumeSpecName "kube-api-access-sxkm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.337483 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxkm8\" (UniqueName: \"kubernetes.io/projected/1127bb8a-8b0b-41e9-84d9-3af25c157a94-kube-api-access-sxkm8\") on node \"crc\" DevicePath \"\""
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.376091 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1127bb8a-8b0b-41e9-84d9-3af25c157a94" (UID: "1127bb8a-8b0b-41e9-84d9-3af25c157a94"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.434839 4771 generic.go:334] "Generic (PLEG): container finished" podID="1222f2a7-debd-4e47-bc0c-c16c371aaa9e" containerID="7e9c9bcf4dea774e33bddbc642e9cc21f2082ca598f5b2bf09a1df7ec2138c32" exitCode=0
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.434939 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535952-vpk4t" event={"ID":"1222f2a7-debd-4e47-bc0c-c16c371aaa9e","Type":"ContainerDied","Data":"7e9c9bcf4dea774e33bddbc642e9cc21f2082ca598f5b2bf09a1df7ec2138c32"}
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.439058 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1127bb8a-8b0b-41e9-84d9-3af25c157a94-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.440001 4771 generic.go:334] "Generic (PLEG): container finished" podID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerID="155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6" exitCode=0
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.440074 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxvj7"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.440173 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxvj7" event={"ID":"1127bb8a-8b0b-41e9-84d9-3af25c157a94","Type":"ContainerDied","Data":"155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6"}
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.440338 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxvj7" event={"ID":"1127bb8a-8b0b-41e9-84d9-3af25c157a94","Type":"ContainerDied","Data":"a4b44cb16ecab1167a67cf419fd4375a7794a02c14093a34d60e04fe22527b77"}
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.440374 4771 scope.go:117] "RemoveContainer" containerID="155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.475137 4771 scope.go:117] "RemoveContainer" containerID="2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.479304 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxvj7"]
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.488039 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kxvj7"]
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.510875 4771 scope.go:117] "RemoveContainer" containerID="7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.553688 4771 scope.go:117] "RemoveContainer" containerID="155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6"
Feb 27 01:52:03 crc kubenswrapper[4771]: E0227 01:52:03.554172 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6\": container with ID starting with 155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6 not found: ID does not exist" containerID="155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.554295 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6"} err="failed to get container status \"155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6\": rpc error: code = NotFound desc = could not find container \"155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6\": container with ID starting with 155e247b343d952d1e7825c862fe581e4a84fec4e7ce40dae38c8746c32945b6 not found: ID does not exist"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.554329 4771 scope.go:117] "RemoveContainer" containerID="2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c"
Feb 27 01:52:03 crc kubenswrapper[4771]: E0227 01:52:03.554907 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c\": container with ID starting with 2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c not found: ID does not exist" containerID="2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.554934 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c"} err="failed to get container status \"2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c\": rpc error: code = NotFound desc = could not find container \"2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c\": container with ID starting with 2bf2513b1c3b3668edd8e9665403e49e8a82c582a67e99c4b30b173e16352b9c not found: ID does not exist"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.554952 4771 scope.go:117] "RemoveContainer" containerID="7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd"
Feb 27 01:52:03 crc kubenswrapper[4771]: E0227 01:52:03.555928 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd\": container with ID starting with 7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd not found: ID does not exist" containerID="7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.555991 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd"} err="failed to get container status \"7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd\": rpc error: code = NotFound desc = could not find container \"7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd\": container with ID starting with 7504bfa61dcceee2a9f1ab999c8b0f4efa51d1630f844cae040340e4903fd4bd not found: ID does not exist"
Feb 27 01:52:03 crc kubenswrapper[4771]: I0227 01:52:03.783741 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" path="/var/lib/kubelet/pods/1127bb8a-8b0b-41e9-84d9-3af25c157a94/volumes"
Feb 27 01:52:04 crc kubenswrapper[4771]: I0227 01:52:04.872244 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535952-vpk4t"
Feb 27 01:52:04 crc kubenswrapper[4771]: I0227 01:52:04.973407 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f4ch\" (UniqueName: \"kubernetes.io/projected/1222f2a7-debd-4e47-bc0c-c16c371aaa9e-kube-api-access-2f4ch\") pod \"1222f2a7-debd-4e47-bc0c-c16c371aaa9e\" (UID: \"1222f2a7-debd-4e47-bc0c-c16c371aaa9e\") "
Feb 27 01:52:04 crc kubenswrapper[4771]: I0227 01:52:04.979396 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1222f2a7-debd-4e47-bc0c-c16c371aaa9e-kube-api-access-2f4ch" (OuterVolumeSpecName: "kube-api-access-2f4ch") pod "1222f2a7-debd-4e47-bc0c-c16c371aaa9e" (UID: "1222f2a7-debd-4e47-bc0c-c16c371aaa9e"). InnerVolumeSpecName "kube-api-access-2f4ch". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:52:05 crc kubenswrapper[4771]: I0227 01:52:05.075821 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f4ch\" (UniqueName: \"kubernetes.io/projected/1222f2a7-debd-4e47-bc0c-c16c371aaa9e-kube-api-access-2f4ch\") on node \"crc\" DevicePath \"\""
Feb 27 01:52:05 crc kubenswrapper[4771]: I0227 01:52:05.461399 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535952-vpk4t" event={"ID":"1222f2a7-debd-4e47-bc0c-c16c371aaa9e","Type":"ContainerDied","Data":"8bb68cb03ea4698c9a9770278ad0e081099d43e3d18b21f8f86122cc9b574b44"}
Feb 27 01:52:05 crc kubenswrapper[4771]: I0227 01:52:05.461699 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bb68cb03ea4698c9a9770278ad0e081099d43e3d18b21f8f86122cc9b574b44"
Feb 27 01:52:05 crc kubenswrapper[4771]: I0227 01:52:05.461454 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535952-vpk4t"
Feb 27 01:52:05 crc kubenswrapper[4771]: I0227 01:52:05.968158 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535946-9w266"]
Feb 27 01:52:06 crc kubenswrapper[4771]: I0227 01:52:06.002236 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535946-9w266"]
Feb 27 01:52:07 crc kubenswrapper[4771]: I0227 01:52:07.785333 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a" path="/var/lib/kubelet/pods/8c2d3b83-c0f9-48e9-9e64-1ad5d78fb32a/volumes"
Feb 27 01:52:30 crc kubenswrapper[4771]: I0227 01:52:30.020592 4771 scope.go:117] "RemoveContainer" containerID="78c36c48b13c35705a9cfdf04dbf6a52be5a934d1eaedd73687393b5792908b4"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.661522 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sfwjc"]
Feb 27 01:53:57 crc kubenswrapper[4771]: E0227 01:53:57.662382 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="registry-server"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.662395 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="registry-server"
Feb 27 01:53:57 crc kubenswrapper[4771]: E0227 01:53:57.662419 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="extract-utilities"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.662426 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="extract-utilities"
Feb 27 01:53:57 crc kubenswrapper[4771]: E0227 01:53:57.662439 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1222f2a7-debd-4e47-bc0c-c16c371aaa9e" containerName="oc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.662445 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1222f2a7-debd-4e47-bc0c-c16c371aaa9e" containerName="oc"
Feb 27 01:53:57 crc kubenswrapper[4771]: E0227 01:53:57.662471 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="extract-content"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.662476 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="extract-content"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.662649 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1222f2a7-debd-4e47-bc0c-c16c371aaa9e" containerName="oc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.662661 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="1127bb8a-8b0b-41e9-84d9-3af25c157a94" containerName="registry-server"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.663849 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.683124 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfwjc"]
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.687270 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6txx\" (UniqueName: \"kubernetes.io/projected/00862d23-a5de-49d4-95fa-50b61debf25c-kube-api-access-d6txx\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.687315 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-catalog-content\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.687343 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-utilities\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.789232 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6txx\" (UniqueName: \"kubernetes.io/projected/00862d23-a5de-49d4-95fa-50b61debf25c-kube-api-access-d6txx\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.789320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-catalog-content\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.789354 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-utilities\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.792156 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-catalog-content\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.793233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-utilities\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:57 crc kubenswrapper[4771]: I0227 01:53:57.824668 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6txx\" (UniqueName: \"kubernetes.io/projected/00862d23-a5de-49d4-95fa-50b61debf25c-kube-api-access-d6txx\") pod \"redhat-marketplace-sfwjc\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") " pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:58 crc kubenswrapper[4771]: I0227 01:53:58.057296 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:53:58 crc kubenswrapper[4771]: I0227 01:53:58.563418 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfwjc"]
Feb 27 01:53:58 crc kubenswrapper[4771]: I0227 01:53:58.743160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfwjc" event={"ID":"00862d23-a5de-49d4-95fa-50b61debf25c","Type":"ContainerStarted","Data":"3ae21b0f24c31905ffdb7a6a467a190503423f4b5203dd0531df08b9794b5a75"}
Feb 27 01:53:58 crc kubenswrapper[4771]: I0227 01:53:58.953229 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:53:58 crc kubenswrapper[4771]: I0227 01:53:58.953323 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:53:59 crc kubenswrapper[4771]: I0227 01:53:59.754660 4771 generic.go:334] "Generic (PLEG): container finished" podID="00862d23-a5de-49d4-95fa-50b61debf25c" containerID="ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47" exitCode=0
Feb 27 01:53:59 crc kubenswrapper[4771]: I0227 01:53:59.754787 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfwjc" event={"ID":"00862d23-a5de-49d4-95fa-50b61debf25c","Type":"ContainerDied","Data":"ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47"}
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.148850 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535954-k96n5"]
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.151933 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535954-k96n5"
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.154573 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.154669 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db"
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.163263 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.172187 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535954-k96n5"]
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.228403 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jb7f\" (UniqueName: \"kubernetes.io/projected/a87ea5d6-cb36-4e80-9a90-ecda58c578ff-kube-api-access-4jb7f\") pod \"auto-csr-approver-29535954-k96n5\" (UID: \"a87ea5d6-cb36-4e80-9a90-ecda58c578ff\") " pod="openshift-infra/auto-csr-approver-29535954-k96n5"
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.330101 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jb7f\" (UniqueName: \"kubernetes.io/projected/a87ea5d6-cb36-4e80-9a90-ecda58c578ff-kube-api-access-4jb7f\") pod \"auto-csr-approver-29535954-k96n5\" (UID: \"a87ea5d6-cb36-4e80-9a90-ecda58c578ff\") " pod="openshift-infra/auto-csr-approver-29535954-k96n5"
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.354767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jb7f\" (UniqueName: \"kubernetes.io/projected/a87ea5d6-cb36-4e80-9a90-ecda58c578ff-kube-api-access-4jb7f\") pod \"auto-csr-approver-29535954-k96n5\" (UID: \"a87ea5d6-cb36-4e80-9a90-ecda58c578ff\") " pod="openshift-infra/auto-csr-approver-29535954-k96n5"
Feb 27 01:54:00 crc kubenswrapper[4771]: I0227 01:54:00.474718 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535954-k96n5"
Feb 27 01:54:00 crc kubenswrapper[4771]: W0227 01:54:00.959866 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda87ea5d6_cb36_4e80_9a90_ecda58c578ff.slice/crio-17ea8f9da2d89123f9f0ed0122fc0c13c0f2e7e3c798ef25dba365d3e5016e59 WatchSource:0}: Error finding container 17ea8f9da2d89123f9f0ed0122fc0c13c0f2e7e3c798ef25dba365d3e5016e59: Status 404 returned error can't find the container with id 17ea8f9da2d89123f9f0ed0122fc0c13c0f2e7e3c798ef25dba365d3e5016e59
Feb 27 01:54:01 crc kubenswrapper[4771]: I0227 01:54:01.010846 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535954-k96n5"]
Feb 27 01:54:01 crc kubenswrapper[4771]: I0227 01:54:01.772203 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535954-k96n5" event={"ID":"a87ea5d6-cb36-4e80-9a90-ecda58c578ff","Type":"ContainerStarted","Data":"17ea8f9da2d89123f9f0ed0122fc0c13c0f2e7e3c798ef25dba365d3e5016e59"}
Feb 27 01:54:01 crc kubenswrapper[4771]: I0227 01:54:01.774723 4771 generic.go:334] "Generic (PLEG): container finished" podID="00862d23-a5de-49d4-95fa-50b61debf25c" containerID="d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2" exitCode=0
Feb 27 01:54:01 crc kubenswrapper[4771]: I0227 01:54:01.787181 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfwjc" event={"ID":"00862d23-a5de-49d4-95fa-50b61debf25c","Type":"ContainerDied","Data":"d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2"}
Feb 27 01:54:02 crc kubenswrapper[4771]: I0227 01:54:02.790135 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfwjc" event={"ID":"00862d23-a5de-49d4-95fa-50b61debf25c","Type":"ContainerStarted","Data":"7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed"}
Feb 27 01:54:02 crc kubenswrapper[4771]: I0227 01:54:02.792244 4771 generic.go:334] "Generic (PLEG): container finished" podID="a87ea5d6-cb36-4e80-9a90-ecda58c578ff" containerID="3670fdbaac6840be3ff89483f6ef29647120e0da2783cd28a55ad11c415061dc" exitCode=0
Feb 27 01:54:02 crc kubenswrapper[4771]: I0227 01:54:02.792302 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535954-k96n5" event={"ID":"a87ea5d6-cb36-4e80-9a90-ecda58c578ff","Type":"ContainerDied","Data":"3670fdbaac6840be3ff89483f6ef29647120e0da2783cd28a55ad11c415061dc"}
Feb 27 01:54:02 crc kubenswrapper[4771]: I0227 01:54:02.815263 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sfwjc" podStartSLOduration=3.360649858 podStartE2EDuration="5.815245194s" podCreationTimestamp="2026-02-27 01:53:57 +0000 UTC" firstStartedPulling="2026-02-27 01:53:59.756770561 +0000 UTC m=+2952.694331879" lastFinishedPulling="2026-02-27 01:54:02.211365927 +0000 UTC m=+2955.148927215" observedRunningTime="2026-02-27 01:54:02.812585641 +0000 UTC m=+2955.750146949" watchObservedRunningTime="2026-02-27 01:54:02.815245194 +0000 UTC m=+2955.752806482"
Feb 27 01:54:04 crc kubenswrapper[4771]: I0227 01:54:04.225644 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535954-k96n5"
Feb 27 01:54:04 crc kubenswrapper[4771]: I0227 01:54:04.410339 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jb7f\" (UniqueName: \"kubernetes.io/projected/a87ea5d6-cb36-4e80-9a90-ecda58c578ff-kube-api-access-4jb7f\") pod \"a87ea5d6-cb36-4e80-9a90-ecda58c578ff\" (UID: \"a87ea5d6-cb36-4e80-9a90-ecda58c578ff\") "
Feb 27 01:54:04 crc kubenswrapper[4771]: I0227 01:54:04.415565 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87ea5d6-cb36-4e80-9a90-ecda58c578ff-kube-api-access-4jb7f" (OuterVolumeSpecName: "kube-api-access-4jb7f") pod "a87ea5d6-cb36-4e80-9a90-ecda58c578ff" (UID: "a87ea5d6-cb36-4e80-9a90-ecda58c578ff"). InnerVolumeSpecName "kube-api-access-4jb7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:54:04 crc kubenswrapper[4771]: I0227 01:54:04.539837 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jb7f\" (UniqueName: \"kubernetes.io/projected/a87ea5d6-cb36-4e80-9a90-ecda58c578ff-kube-api-access-4jb7f\") on node \"crc\" DevicePath \"\""
Feb 27 01:54:04 crc kubenswrapper[4771]: I0227 01:54:04.816928 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535954-k96n5" event={"ID":"a87ea5d6-cb36-4e80-9a90-ecda58c578ff","Type":"ContainerDied","Data":"17ea8f9da2d89123f9f0ed0122fc0c13c0f2e7e3c798ef25dba365d3e5016e59"}
Feb 27 01:54:04 crc kubenswrapper[4771]: I0227 01:54:04.816998 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17ea8f9da2d89123f9f0ed0122fc0c13c0f2e7e3c798ef25dba365d3e5016e59"
Feb 27 01:54:04 crc kubenswrapper[4771]: I0227 01:54:04.817075 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535954-k96n5"
Feb 27 01:54:05 crc kubenswrapper[4771]: I0227 01:54:05.319821 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535948-dcslg"]
Feb 27 01:54:05 crc kubenswrapper[4771]: I0227 01:54:05.340186 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535948-dcslg"]
Feb 27 01:54:05 crc kubenswrapper[4771]: I0227 01:54:05.785047 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb3590d-ec9d-4437-82c7-24e0f748d130" path="/var/lib/kubelet/pods/feb3590d-ec9d-4437-82c7-24e0f748d130/volumes"
Feb 27 01:54:08 crc kubenswrapper[4771]: I0227 01:54:08.058430 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:54:08 crc kubenswrapper[4771]: I0227 01:54:08.059138 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:54:08 crc kubenswrapper[4771]: I0227 01:54:08.157692 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:54:08 crc kubenswrapper[4771]: I0227 01:54:08.957349 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:54:09 crc kubenswrapper[4771]: I0227 01:54:09.017724 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfwjc"]
Feb 27 01:54:10 crc kubenswrapper[4771]: I0227 01:54:10.906618 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfwjc" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" containerName="registry-server" containerID="cri-o://7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed" gracePeriod=2
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.435606 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.585330 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-utilities\") pod \"00862d23-a5de-49d4-95fa-50b61debf25c\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") "
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.585615 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-catalog-content\") pod \"00862d23-a5de-49d4-95fa-50b61debf25c\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") "
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.585661 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6txx\" (UniqueName: \"kubernetes.io/projected/00862d23-a5de-49d4-95fa-50b61debf25c-kube-api-access-d6txx\") pod \"00862d23-a5de-49d4-95fa-50b61debf25c\" (UID: \"00862d23-a5de-49d4-95fa-50b61debf25c\") "
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.586408 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-utilities" (OuterVolumeSpecName: "utilities") pod "00862d23-a5de-49d4-95fa-50b61debf25c" (UID: "00862d23-a5de-49d4-95fa-50b61debf25c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.593381 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00862d23-a5de-49d4-95fa-50b61debf25c-kube-api-access-d6txx" (OuterVolumeSpecName: "kube-api-access-d6txx") pod "00862d23-a5de-49d4-95fa-50b61debf25c" (UID: "00862d23-a5de-49d4-95fa-50b61debf25c"). InnerVolumeSpecName "kube-api-access-d6txx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.612051 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00862d23-a5de-49d4-95fa-50b61debf25c" (UID: "00862d23-a5de-49d4-95fa-50b61debf25c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.687685 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.687722 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6txx\" (UniqueName: \"kubernetes.io/projected/00862d23-a5de-49d4-95fa-50b61debf25c-kube-api-access-d6txx\") on node \"crc\" DevicePath \"\""
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.687738 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00862d23-a5de-49d4-95fa-50b61debf25c-utilities\") on node \"crc\" DevicePath \"\""
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.918083 4771 generic.go:334] "Generic (PLEG): container finished" podID="00862d23-a5de-49d4-95fa-50b61debf25c" containerID="7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed" exitCode=0
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.918131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfwjc" event={"ID":"00862d23-a5de-49d4-95fa-50b61debf25c","Type":"ContainerDied","Data":"7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed"}
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.918186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfwjc" event={"ID":"00862d23-a5de-49d4-95fa-50b61debf25c","Type":"ContainerDied","Data":"3ae21b0f24c31905ffdb7a6a467a190503423f4b5203dd0531df08b9794b5a75"}
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.918195 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfwjc"
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.918206 4771 scope.go:117] "RemoveContainer" containerID="7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed"
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.956972 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfwjc"]
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.964168 4771 scope.go:117] "RemoveContainer" containerID="d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2"
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.971961 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfwjc"]
Feb 27 01:54:11 crc kubenswrapper[4771]: I0227 01:54:11.991791 4771 scope.go:117] "RemoveContainer" containerID="ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47"
Feb 27 01:54:12 crc kubenswrapper[4771]: I0227 01:54:12.063287 4771 scope.go:117] "RemoveContainer" containerID="7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed"
Feb 27 01:54:12 crc kubenswrapper[4771]: E0227 01:54:12.064005 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed\": container with ID starting with 7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed not found: ID does not exist" containerID="7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed"
Feb 27 01:54:12 crc kubenswrapper[4771]: I0227 01:54:12.064090 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed"} err="failed to get container status \"7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed\": rpc error: code = NotFound desc = could not find container \"7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed\": container with ID starting with 7a469e9c110892be49563c67a61816288757507a4b6ec1edda1e217a331711ed not found: ID does not exist"
Feb 27 01:54:12 crc kubenswrapper[4771]: I0227 01:54:12.064147 4771 scope.go:117] "RemoveContainer" containerID="d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2"
Feb 27 01:54:12 crc kubenswrapper[4771]: E0227 01:54:12.064778 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2\": container with ID starting with d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2 not found: ID does not exist" containerID="d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2"
Feb 27 01:54:12 crc kubenswrapper[4771]: I0227 01:54:12.064810 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2"} err="failed to get container status \"d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2\": rpc error: code = NotFound desc = could not find container \"d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2\": container with ID starting with d784e16225d76280b0e492a3ebacce432725ae94faaa2db5d26b03e7299668f2 not found: ID does not exist"
Feb 27 01:54:12 crc kubenswrapper[4771]: I0227 01:54:12.064826 4771 scope.go:117] "RemoveContainer" containerID="ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47"
Feb 27 01:54:12 crc kubenswrapper[4771]: E0227 01:54:12.065454 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47\": container with ID starting with ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47 not found: ID does not exist" containerID="ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47"
Feb 27 01:54:12 crc kubenswrapper[4771]: I0227 01:54:12.065479 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47"} err="failed to get container status \"ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47\": rpc error: code = NotFound desc = could not find container \"ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47\": container with ID starting with ec08883b62b3578fd14565f80f68937decf4fde7e54ffaa667b33f7ba407ed47 not found: ID does not exist"
Feb 27 01:54:13 crc kubenswrapper[4771]: I0227 01:54:13.797528 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" path="/var/lib/kubelet/pods/00862d23-a5de-49d4-95fa-50b61debf25c/volumes"
Feb 27 01:54:28 crc kubenswrapper[4771]: I0227 01:54:28.953489 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:54:28 crc kubenswrapper[4771]: I0227 01:54:28.954090 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:54:30 crc kubenswrapper[4771]: I0227 01:54:30.158443 4771 scope.go:117] "RemoveContainer" containerID="0dfdbf95667659f25c1a91c3038129c9723547af061466b8cb8a1a926f8019f3"
Feb 27 01:54:58 crc kubenswrapper[4771]: I0227 01:54:58.952993 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 01:54:58 crc kubenswrapper[4771]: I0227 01:54:58.953723 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 01:54:58 crc kubenswrapper[4771]: I0227 01:54:58.953827 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn"
Feb 27 01:54:58 crc kubenswrapper[4771]: I0227 01:54:58.954831 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9bc7e891ab68febf267f0a069a80d85ee866b60dfced10cc635f46adae460c1b"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 27 01:54:58 crc kubenswrapper[4771]: I0227 01:54:58.954920 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://9bc7e891ab68febf267f0a069a80d85ee866b60dfced10cc635f46adae460c1b" gracePeriod=600
Feb 27 01:54:59 crc kubenswrapper[4771]: I0227 01:54:59.422124 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="9bc7e891ab68febf267f0a069a80d85ee866b60dfced10cc635f46adae460c1b" exitCode=0
Feb 27 01:54:59 crc kubenswrapper[4771]: I0227 01:54:59.422209 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"9bc7e891ab68febf267f0a069a80d85ee866b60dfced10cc635f46adae460c1b"}
Feb 27 01:54:59 crc kubenswrapper[4771]: I0227 01:54:59.422662 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc"}
Feb 27 01:54:59 crc kubenswrapper[4771]: I0227 01:54:59.422710 4771 scope.go:117] "RemoveContainer" containerID="8e29af5591b54464fb195f5f4706d4495ac8608be7fa1e6f71208ee25a83a83f"
Feb 27 01:55:02 crc kubenswrapper[4771]: I0227 01:55:02.929093 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7b8d8fb79c-qxz4q" podUID="d0f1ec21-667d-46de-abbb-cb95d29e861c" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.153677 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535956-fm6j9"]
Feb 27 01:56:00 crc kubenswrapper[4771]: E0227 01:56:00.155134 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" containerName="extract-utilities"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.155163 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" containerName="extract-utilities"
Feb 27 01:56:00 crc kubenswrapper[4771]: E0227 01:56:00.155199 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87ea5d6-cb36-4e80-9a90-ecda58c578ff" containerName="oc"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.155211 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87ea5d6-cb36-4e80-9a90-ecda58c578ff" containerName="oc"
Feb 27 01:56:00 crc kubenswrapper[4771]: E0227 01:56:00.155262 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" containerName="registry-server"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.155274 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" containerName="registry-server"
Feb 27 01:56:00 crc kubenswrapper[4771]: E0227 01:56:00.155316 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" containerName="extract-content"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.155330 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" containerName="extract-content"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.155745 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87ea5d6-cb36-4e80-9a90-ecda58c578ff" containerName="oc"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.155818 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="00862d23-a5de-49d4-95fa-50b61debf25c" containerName="registry-server"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.157018 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535956-fm6j9"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.160356 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.160833 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.162745 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.171084 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535956-fm6j9"]
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.258065 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv8fz\" (UniqueName: \"kubernetes.io/projected/c85e9c07-b194-45d7-936a-33ade4484507-kube-api-access-lv8fz\") pod \"auto-csr-approver-29535956-fm6j9\" (UID: \"c85e9c07-b194-45d7-936a-33ade4484507\") " pod="openshift-infra/auto-csr-approver-29535956-fm6j9"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.359924 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv8fz\" (UniqueName: \"kubernetes.io/projected/c85e9c07-b194-45d7-936a-33ade4484507-kube-api-access-lv8fz\") pod \"auto-csr-approver-29535956-fm6j9\" (UID: \"c85e9c07-b194-45d7-936a-33ade4484507\") " pod="openshift-infra/auto-csr-approver-29535956-fm6j9"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.390289 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv8fz\" (UniqueName: \"kubernetes.io/projected/c85e9c07-b194-45d7-936a-33ade4484507-kube-api-access-lv8fz\") pod \"auto-csr-approver-29535956-fm6j9\" (UID: \"c85e9c07-b194-45d7-936a-33ade4484507\") " pod="openshift-infra/auto-csr-approver-29535956-fm6j9"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.479607 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535956-fm6j9"
Feb 27 01:56:00 crc kubenswrapper[4771]: I0227 01:56:00.962431 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535956-fm6j9"]
Feb 27 01:56:00 crc kubenswrapper[4771]: W0227 01:56:00.972634 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc85e9c07_b194_45d7_936a_33ade4484507.slice/crio-ab56058b6629c1311bc850ebd3b1fb9c12717f95770f364b6c129cc0332782b9 WatchSource:0}: Error finding container ab56058b6629c1311bc850ebd3b1fb9c12717f95770f364b6c129cc0332782b9: Status 404 returned error can't find the container with id ab56058b6629c1311bc850ebd3b1fb9c12717f95770f364b6c129cc0332782b9
Feb 27 01:56:01 crc kubenswrapper[4771]: I0227 01:56:01.108439 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535956-fm6j9" event={"ID":"c85e9c07-b194-45d7-936a-33ade4484507","Type":"ContainerStarted","Data":"ab56058b6629c1311bc850ebd3b1fb9c12717f95770f364b6c129cc0332782b9"}
Feb 27 01:56:03 crc kubenswrapper[4771]: I0227 01:56:03.141783 4771 generic.go:334] "Generic (PLEG): container finished" podID="c85e9c07-b194-45d7-936a-33ade4484507" containerID="abe02359aba3470f4e2235c39f84b863bd1f52bba85116ab6a231d4f94e48ca3" exitCode=0
Feb 27 01:56:03 crc kubenswrapper[4771]: I0227 01:56:03.141941 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535956-fm6j9" event={"ID":"c85e9c07-b194-45d7-936a-33ade4484507","Type":"ContainerDied","Data":"abe02359aba3470f4e2235c39f84b863bd1f52bba85116ab6a231d4f94e48ca3"}
Feb 27 01:56:04 crc kubenswrapper[4771]: I0227 01:56:04.574153 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535956-fm6j9"
Feb 27 01:56:04 crc kubenswrapper[4771]: I0227 01:56:04.756907 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv8fz\" (UniqueName: \"kubernetes.io/projected/c85e9c07-b194-45d7-936a-33ade4484507-kube-api-access-lv8fz\") pod \"c85e9c07-b194-45d7-936a-33ade4484507\" (UID: \"c85e9c07-b194-45d7-936a-33ade4484507\") "
Feb 27 01:56:04 crc kubenswrapper[4771]: I0227 01:56:04.763177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85e9c07-b194-45d7-936a-33ade4484507-kube-api-access-lv8fz" (OuterVolumeSpecName: "kube-api-access-lv8fz") pod "c85e9c07-b194-45d7-936a-33ade4484507" (UID: "c85e9c07-b194-45d7-936a-33ade4484507"). InnerVolumeSpecName "kube-api-access-lv8fz".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:56:04 crc kubenswrapper[4771]: I0227 01:56:04.859338 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv8fz\" (UniqueName: \"kubernetes.io/projected/c85e9c07-b194-45d7-936a-33ade4484507-kube-api-access-lv8fz\") on node \"crc\" DevicePath \"\"" Feb 27 01:56:05 crc kubenswrapper[4771]: I0227 01:56:05.169405 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535956-fm6j9" event={"ID":"c85e9c07-b194-45d7-936a-33ade4484507","Type":"ContainerDied","Data":"ab56058b6629c1311bc850ebd3b1fb9c12717f95770f364b6c129cc0332782b9"} Feb 27 01:56:05 crc kubenswrapper[4771]: I0227 01:56:05.170177 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab56058b6629c1311bc850ebd3b1fb9c12717f95770f364b6c129cc0332782b9" Feb 27 01:56:05 crc kubenswrapper[4771]: I0227 01:56:05.169469 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535956-fm6j9" Feb 27 01:56:05 crc kubenswrapper[4771]: I0227 01:56:05.673212 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535950-flbxw"] Feb 27 01:56:05 crc kubenswrapper[4771]: I0227 01:56:05.683503 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535950-flbxw"] Feb 27 01:56:05 crc kubenswrapper[4771]: I0227 01:56:05.788092 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de3ac97-64a7-45a4-9363-dff9ee8d5c9f" path="/var/lib/kubelet/pods/1de3ac97-64a7-45a4-9363-dff9ee8d5c9f/volumes" Feb 27 01:56:30 crc kubenswrapper[4771]: I0227 01:56:30.272487 4771 scope.go:117] "RemoveContainer" containerID="65991994110bdd7e030a864b76e72e95c2a27dc16d3b0305d959c2e847b7f0dc" Feb 27 01:57:28 crc kubenswrapper[4771]: I0227 01:57:28.953371 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:57:28 crc kubenswrapper[4771]: I0227 01:57:28.953912 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:57:58 crc kubenswrapper[4771]: I0227 01:57:58.952863 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:57:58 crc kubenswrapper[4771]: I0227 01:57:58.953461 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.160736 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535958-7982l"] Feb 27 01:58:00 crc 
kubenswrapper[4771]: E0227 01:58:00.161488 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85e9c07-b194-45d7-936a-33ade4484507" containerName="oc" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.161506 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85e9c07-b194-45d7-936a-33ade4484507" containerName="oc" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.161784 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85e9c07-b194-45d7-936a-33ade4484507" containerName="oc" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.162642 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535958-7982l" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.169932 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535958-7982l"] Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.172241 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.172339 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.172241 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.246955 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cptgg\" (UniqueName: \"kubernetes.io/projected/193afeb9-addb-4815-bf2f-4eebd0e2dfac-kube-api-access-cptgg\") pod \"auto-csr-approver-29535958-7982l\" (UID: \"193afeb9-addb-4815-bf2f-4eebd0e2dfac\") " pod="openshift-infra/auto-csr-approver-29535958-7982l" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.348818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cptgg\" (UniqueName: \"kubernetes.io/projected/193afeb9-addb-4815-bf2f-4eebd0e2dfac-kube-api-access-cptgg\") pod \"auto-csr-approver-29535958-7982l\" (UID: \"193afeb9-addb-4815-bf2f-4eebd0e2dfac\") " pod="openshift-infra/auto-csr-approver-29535958-7982l" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.368196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cptgg\" (UniqueName: \"kubernetes.io/projected/193afeb9-addb-4815-bf2f-4eebd0e2dfac-kube-api-access-cptgg\") pod \"auto-csr-approver-29535958-7982l\" (UID: \"193afeb9-addb-4815-bf2f-4eebd0e2dfac\") " pod="openshift-infra/auto-csr-approver-29535958-7982l" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.527181 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535958-7982l" Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.977047 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535958-7982l"] Feb 27 01:58:00 crc kubenswrapper[4771]: I0227 01:58:00.986868 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 01:58:01 crc kubenswrapper[4771]: I0227 01:58:01.305971 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535958-7982l" event={"ID":"193afeb9-addb-4815-bf2f-4eebd0e2dfac","Type":"ContainerStarted","Data":"ffc605296415414cd1590e9b42dc1058b4218056cbaa300aa08763e8a7c5c240"} Feb 27 01:58:02 crc kubenswrapper[4771]: I0227 01:58:02.315203 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535958-7982l" event={"ID":"193afeb9-addb-4815-bf2f-4eebd0e2dfac","Type":"ContainerStarted","Data":"0179f80dea866d464d740c87d427df2b96226568e75808bf880a34c5e2effebb"} Feb 27 01:58:02 crc kubenswrapper[4771]: I0227 01:58:02.340335 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535958-7982l" podStartSLOduration=1.50829342 podStartE2EDuration="2.340313299s" podCreationTimestamp="2026-02-27 01:58:00 +0000 UTC" firstStartedPulling="2026-02-27 01:58:00.986655243 +0000 UTC m=+3193.924216531" lastFinishedPulling="2026-02-27 01:58:01.818675122 +0000 UTC m=+3194.756236410" observedRunningTime="2026-02-27 01:58:02.326607265 +0000 UTC m=+3195.264168563" watchObservedRunningTime="2026-02-27 01:58:02.340313299 +0000 UTC m=+3195.277874587" Feb 27 01:58:03 crc kubenswrapper[4771]: I0227 01:58:03.327291 4771 generic.go:334] "Generic (PLEG): container finished" podID="193afeb9-addb-4815-bf2f-4eebd0e2dfac" containerID="0179f80dea866d464d740c87d427df2b96226568e75808bf880a34c5e2effebb" exitCode=0 Feb 27 01:58:03 crc kubenswrapper[4771]: I0227 01:58:03.327398 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535958-7982l" event={"ID":"193afeb9-addb-4815-bf2f-4eebd0e2dfac","Type":"ContainerDied","Data":"0179f80dea866d464d740c87d427df2b96226568e75808bf880a34c5e2effebb"} Feb 27 01:58:04 crc kubenswrapper[4771]: I0227 01:58:04.763928 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535958-7982l" Feb 27 01:58:04 crc kubenswrapper[4771]: I0227 01:58:04.831904 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cptgg\" (UniqueName: \"kubernetes.io/projected/193afeb9-addb-4815-bf2f-4eebd0e2dfac-kube-api-access-cptgg\") pod \"193afeb9-addb-4815-bf2f-4eebd0e2dfac\" (UID: \"193afeb9-addb-4815-bf2f-4eebd0e2dfac\") " Feb 27 01:58:04 crc kubenswrapper[4771]: I0227 01:58:04.838998 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193afeb9-addb-4815-bf2f-4eebd0e2dfac-kube-api-access-cptgg" (OuterVolumeSpecName: "kube-api-access-cptgg") pod "193afeb9-addb-4815-bf2f-4eebd0e2dfac" (UID: "193afeb9-addb-4815-bf2f-4eebd0e2dfac"). InnerVolumeSpecName "kube-api-access-cptgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 01:58:04 crc kubenswrapper[4771]: I0227 01:58:04.933726 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cptgg\" (UniqueName: \"kubernetes.io/projected/193afeb9-addb-4815-bf2f-4eebd0e2dfac-kube-api-access-cptgg\") on node \"crc\" DevicePath \"\"" Feb 27 01:58:05 crc kubenswrapper[4771]: I0227 01:58:05.375202 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535958-7982l" event={"ID":"193afeb9-addb-4815-bf2f-4eebd0e2dfac","Type":"ContainerDied","Data":"ffc605296415414cd1590e9b42dc1058b4218056cbaa300aa08763e8a7c5c240"} Feb 27 01:58:05 crc kubenswrapper[4771]: I0227 01:58:05.375277 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc605296415414cd1590e9b42dc1058b4218056cbaa300aa08763e8a7c5c240" Feb 27 01:58:05 crc kubenswrapper[4771]: I0227 01:58:05.375370 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535958-7982l" Feb 27 01:58:05 crc kubenswrapper[4771]: I0227 01:58:05.416510 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535952-vpk4t"] Feb 27 01:58:05 crc kubenswrapper[4771]: I0227 01:58:05.428727 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535952-vpk4t"] Feb 27 01:58:05 crc kubenswrapper[4771]: I0227 01:58:05.785921 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1222f2a7-debd-4e47-bc0c-c16c371aaa9e" path="/var/lib/kubelet/pods/1222f2a7-debd-4e47-bc0c-c16c371aaa9e/volumes" Feb 27 01:58:28 crc kubenswrapper[4771]: I0227 01:58:28.953916 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 01:58:28 crc kubenswrapper[4771]: I0227 01:58:28.954523 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 01:58:28 crc kubenswrapper[4771]: I0227 01:58:28.954611 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 01:58:28 crc kubenswrapper[4771]: I0227 01:58:28.955393 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 01:58:28 crc kubenswrapper[4771]: I0227 01:58:28.955472 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" gracePeriod=600 Feb 27 01:58:29 crc kubenswrapper[4771]: E0227 01:58:29.081059 4771 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:58:29 crc kubenswrapper[4771]: I0227 01:58:29.617633 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" exitCode=0 Feb 27 01:58:29 crc kubenswrapper[4771]: I0227 01:58:29.617844 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc"} Feb 27 01:58:29 crc kubenswrapper[4771]: I0227 01:58:29.617938 4771 scope.go:117] "RemoveContainer" containerID="9bc7e891ab68febf267f0a069a80d85ee866b60dfced10cc635f46adae460c1b" Feb 27 01:58:29 crc kubenswrapper[4771]: I0227 01:58:29.618644 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 01:58:29 crc kubenswrapper[4771]: E0227 01:58:29.618925 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:58:30 crc kubenswrapper[4771]: I0227 01:58:30.372407 4771 scope.go:117] "RemoveContainer" containerID="7e9c9bcf4dea774e33bddbc642e9cc21f2082ca598f5b2bf09a1df7ec2138c32" Feb 27 01:58:41 crc kubenswrapper[4771]: I0227 01:58:41.773622 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 01:58:41 crc kubenswrapper[4771]: E0227 01:58:41.777284 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:58:56 crc kubenswrapper[4771]: I0227 01:58:56.773437 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 01:58:56 crc kubenswrapper[4771]: E0227 01:58:56.774115 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:59:07 crc kubenswrapper[4771]: I0227 01:59:07.781990 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 
01:59:07 crc kubenswrapper[4771]: E0227 01:59:07.782938 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:59:19 crc kubenswrapper[4771]: I0227 01:59:19.774062 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 01:59:19 crc kubenswrapper[4771]: E0227 01:59:19.775249 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:59:33 crc kubenswrapper[4771]: I0227 01:59:33.775176 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 01:59:33 crc kubenswrapper[4771]: E0227 01:59:33.776539 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 01:59:45 crc kubenswrapper[4771]: I0227 01:59:45.774111 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 01:59:45 crc kubenswrapper[4771]: E0227 01:59:45.775048 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.186243 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm"] Feb 27 02:00:00 crc kubenswrapper[4771]: E0227 02:00:00.187280 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193afeb9-addb-4815-bf2f-4eebd0e2dfac" containerName="oc" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.187297 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="193afeb9-addb-4815-bf2f-4eebd0e2dfac" containerName="oc" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.187539 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="193afeb9-addb-4815-bf2f-4eebd0e2dfac" containerName="oc" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.188339 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.190773 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.191033 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.232309 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535960-plddw"] Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.234025 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535960-plddw" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.237003 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.237024 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.239169 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.243601 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm"] Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.267584 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535960-plddw"] Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.275458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c525674-3bb7-46eb-82c8-ace60474f550-config-volume\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.275564 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzg8z\" (UniqueName: \"kubernetes.io/projected/7c525674-3bb7-46eb-82c8-ace60474f550-kube-api-access-fzg8z\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.275824 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c525674-3bb7-46eb-82c8-ace60474f550-secret-volume\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.276083 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6q7l\" (UniqueName: \"kubernetes.io/projected/e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5-kube-api-access-j6q7l\") pod \"auto-csr-approver-29535960-plddw\" (UID: \"e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5\") " pod="openshift-infra/auto-csr-approver-29535960-plddw" Feb 27 02:00:00 crc 
kubenswrapper[4771]: I0227 02:00:00.378094 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6q7l\" (UniqueName: \"kubernetes.io/projected/e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5-kube-api-access-j6q7l\") pod \"auto-csr-approver-29535960-plddw\" (UID: \"e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5\") " pod="openshift-infra/auto-csr-approver-29535960-plddw" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.378423 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c525674-3bb7-46eb-82c8-ace60474f550-config-volume\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.378561 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzg8z\" (UniqueName: \"kubernetes.io/projected/7c525674-3bb7-46eb-82c8-ace60474f550-kube-api-access-fzg8z\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.378683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c525674-3bb7-46eb-82c8-ace60474f550-secret-volume\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.379598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c525674-3bb7-46eb-82c8-ace60474f550-config-volume\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.384745 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c525674-3bb7-46eb-82c8-ace60474f550-secret-volume\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.396603 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6q7l\" (UniqueName: \"kubernetes.io/projected/e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5-kube-api-access-j6q7l\") pod \"auto-csr-approver-29535960-plddw\" (UID: \"e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5\") " pod="openshift-infra/auto-csr-approver-29535960-plddw" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.399587 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzg8z\" (UniqueName: \"kubernetes.io/projected/7c525674-3bb7-46eb-82c8-ace60474f550-kube-api-access-fzg8z\") pod \"collect-profiles-29535960-c4dcm\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.551948 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.564111 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535960-plddw" Feb 27 02:00:00 crc kubenswrapper[4771]: I0227 02:00:00.776605 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:00:00 crc kubenswrapper[4771]: E0227 02:00:00.777175 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:00:01 crc kubenswrapper[4771]: I0227 02:00:01.042703 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535960-plddw"] Feb 27 02:00:01 crc kubenswrapper[4771]: W0227 02:00:01.142853 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c525674_3bb7_46eb_82c8_ace60474f550.slice/crio-af35350a2378827c67e73109524db053355278ed6b9a4be5e728e6ea484d2f52 WatchSource:0}: Error finding container af35350a2378827c67e73109524db053355278ed6b9a4be5e728e6ea484d2f52: Status 404 returned error can't find the container with id af35350a2378827c67e73109524db053355278ed6b9a4be5e728e6ea484d2f52 Feb 27 02:00:01 crc kubenswrapper[4771]: I0227 02:00:01.143415 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm"] Feb 27 02:00:01 crc kubenswrapper[4771]: I0227 02:00:01.586827 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535960-plddw" event={"ID":"e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5","Type":"ContainerStarted","Data":"644621dc88ef0dcbda13bf2a4b9c001c59f77845f05c577a66417ad5ea193844"} Feb 27 02:00:01 crc kubenswrapper[4771]: I0227 02:00:01.588234 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" event={"ID":"7c525674-3bb7-46eb-82c8-ace60474f550","Type":"ContainerStarted","Data":"f1de2d783b742ba5938e1c63b67c348ae0d789e72aba2f52051364ec25e3d91f"} Feb 27 02:00:01 crc kubenswrapper[4771]: I0227 02:00:01.588285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" event={"ID":"7c525674-3bb7-46eb-82c8-ace60474f550","Type":"ContainerStarted","Data":"af35350a2378827c67e73109524db053355278ed6b9a4be5e728e6ea484d2f52"} Feb 27 02:00:02 crc kubenswrapper[4771]: I0227 02:00:02.598084 4771 generic.go:334] "Generic (PLEG): container finished" podID="7c525674-3bb7-46eb-82c8-ace60474f550" containerID="f1de2d783b742ba5938e1c63b67c348ae0d789e72aba2f52051364ec25e3d91f" exitCode=0 Feb 27 02:00:02 crc kubenswrapper[4771]: I0227 02:00:02.598185 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" event={"ID":"7c525674-3bb7-46eb-82c8-ace60474f550","Type":"ContainerDied","Data":"f1de2d783b742ba5938e1c63b67c348ae0d789e72aba2f52051364ec25e3d91f"} Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.010760 4771 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.061226 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c525674-3bb7-46eb-82c8-ace60474f550-secret-volume\") pod \"7c525674-3bb7-46eb-82c8-ace60474f550\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.061415 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c525674-3bb7-46eb-82c8-ace60474f550-config-volume\") pod \"7c525674-3bb7-46eb-82c8-ace60474f550\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.061473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzg8z\" (UniqueName: \"kubernetes.io/projected/7c525674-3bb7-46eb-82c8-ace60474f550-kube-api-access-fzg8z\") pod \"7c525674-3bb7-46eb-82c8-ace60474f550\" (UID: \"7c525674-3bb7-46eb-82c8-ace60474f550\") " Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.062031 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c525674-3bb7-46eb-82c8-ace60474f550-config-volume" (OuterVolumeSpecName: "config-volume") pod "7c525674-3bb7-46eb-82c8-ace60474f550" (UID: "7c525674-3bb7-46eb-82c8-ace60474f550"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.069965 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c525674-3bb7-46eb-82c8-ace60474f550-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c525674-3bb7-46eb-82c8-ace60474f550" (UID: "7c525674-3bb7-46eb-82c8-ace60474f550"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.069998 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c525674-3bb7-46eb-82c8-ace60474f550-kube-api-access-fzg8z" (OuterVolumeSpecName: "kube-api-access-fzg8z") pod "7c525674-3bb7-46eb-82c8-ace60474f550" (UID: "7c525674-3bb7-46eb-82c8-ace60474f550"). InnerVolumeSpecName "kube-api-access-fzg8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.164140 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c525674-3bb7-46eb-82c8-ace60474f550-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.164199 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c525674-3bb7-46eb-82c8-ace60474f550-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.164211 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzg8z\" (UniqueName: \"kubernetes.io/projected/7c525674-3bb7-46eb-82c8-ace60474f550-kube-api-access-fzg8z\") on node \"crc\" DevicePath \"\"" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.618911 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" event={"ID":"7c525674-3bb7-46eb-82c8-ace60474f550","Type":"ContainerDied","Data":"af35350a2378827c67e73109524db053355278ed6b9a4be5e728e6ea484d2f52"} Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.619177 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af35350a2378827c67e73109524db053355278ed6b9a4be5e728e6ea484d2f52" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.618999 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535960-c4dcm" Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.691120 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv"] Feb 27 02:00:04 crc kubenswrapper[4771]: I0227 02:00:04.700851 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535915-54bjv"] Feb 27 02:00:05 crc kubenswrapper[4771]: I0227 02:00:05.785848 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e692db9a-5217-48e9-a817-4ba90c53dc40" path="/var/lib/kubelet/pods/e692db9a-5217-48e9-a817-4ba90c53dc40/volumes" Feb 27 02:00:11 crc kubenswrapper[4771]: I0227 02:00:11.780267 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:00:11 crc kubenswrapper[4771]: E0227 02:00:11.782759 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:00:13 crc kubenswrapper[4771]: I0227 02:00:13.712886 4771 generic.go:334] "Generic (PLEG): container finished" podID="e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5" containerID="877f8124a79b3607236d0a11442fbf3d7a87931db76d21be89efbeb0295a1431" exitCode=0 Feb 27 02:00:13 crc kubenswrapper[4771]: I0227 02:00:13.712992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535960-plddw" event={"ID":"e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5","Type":"ContainerDied","Data":"877f8124a79b3607236d0a11442fbf3d7a87931db76d21be89efbeb0295a1431"} Feb 27 02:00:15 crc 
kubenswrapper[4771]: I0227 02:00:15.223017 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535960-plddw" Feb 27 02:00:15 crc kubenswrapper[4771]: I0227 02:00:15.406496 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6q7l\" (UniqueName: \"kubernetes.io/projected/e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5-kube-api-access-j6q7l\") pod \"e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5\" (UID: \"e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5\") " Feb 27 02:00:15 crc kubenswrapper[4771]: I0227 02:00:15.416091 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5-kube-api-access-j6q7l" (OuterVolumeSpecName: "kube-api-access-j6q7l") pod "e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5" (UID: "e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5"). InnerVolumeSpecName "kube-api-access-j6q7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:00:15 crc kubenswrapper[4771]: I0227 02:00:15.509787 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6q7l\" (UniqueName: \"kubernetes.io/projected/e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5-kube-api-access-j6q7l\") on node \"crc\" DevicePath \"\"" Feb 27 02:00:15 crc kubenswrapper[4771]: I0227 02:00:15.738895 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535960-plddw" event={"ID":"e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5","Type":"ContainerDied","Data":"644621dc88ef0dcbda13bf2a4b9c001c59f77845f05c577a66417ad5ea193844"} Feb 27 02:00:15 crc kubenswrapper[4771]: I0227 02:00:15.739264 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="644621dc88ef0dcbda13bf2a4b9c001c59f77845f05c577a66417ad5ea193844" Feb 27 02:00:15 crc kubenswrapper[4771]: I0227 02:00:15.738970 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535960-plddw" Feb 27 02:00:16 crc kubenswrapper[4771]: I0227 02:00:16.294184 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535954-k96n5"] Feb 27 02:00:16 crc kubenswrapper[4771]: I0227 02:00:16.305133 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535954-k96n5"] Feb 27 02:00:17 crc kubenswrapper[4771]: I0227 02:00:17.785446 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87ea5d6-cb36-4e80-9a90-ecda58c578ff" path="/var/lib/kubelet/pods/a87ea5d6-cb36-4e80-9a90-ecda58c578ff/volumes" Feb 27 02:00:22 crc kubenswrapper[4771]: I0227 02:00:22.774440 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:00:22 crc kubenswrapper[4771]: E0227 02:00:22.775454 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:00:30 crc kubenswrapper[4771]: I0227 02:00:30.463064 4771 scope.go:117] "RemoveContainer" containerID="7c4907178ca1611c934099998063d527b93238a7ffb83d3e9030d58b6ba31ad3" Feb 27 02:00:30 crc kubenswrapper[4771]: I0227 02:00:30.507444 4771 scope.go:117] "RemoveContainer" containerID="3670fdbaac6840be3ff89483f6ef29647120e0da2783cd28a55ad11c415061dc" Feb 27 02:00:36 crc kubenswrapper[4771]: I0227 02:00:36.774228 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:00:36 crc kubenswrapper[4771]: E0227 02:00:36.775464 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:00:51 crc kubenswrapper[4771]: I0227 02:00:51.773848 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:00:51 crc kubenswrapper[4771]: E0227 02:00:51.774526 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.156164 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29535961-smmft"] Feb 27 02:01:00 crc kubenswrapper[4771]: E0227 02:01:00.157103 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c525674-3bb7-46eb-82c8-ace60474f550" containerName="collect-profiles" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.157118 4771 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7c525674-3bb7-46eb-82c8-ace60474f550" containerName="collect-profiles" Feb 27 02:01:00 crc kubenswrapper[4771]: E0227 02:01:00.157133 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5" containerName="oc" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.157139 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5" containerName="oc" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.157318 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c525674-3bb7-46eb-82c8-ace60474f550" containerName="collect-profiles" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.157343 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5" containerName="oc" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.157911 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.167100 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535961-smmft"] Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.325746 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-combined-ca-bundle\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.325815 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnzfl\" (UniqueName: \"kubernetes.io/projected/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-kube-api-access-hnzfl\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.325839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-config-data\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.325868 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-fernet-keys\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.428531 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-combined-ca-bundle\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.428667 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnzfl\" (UniqueName: \"kubernetes.io/projected/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-kube-api-access-hnzfl\") pod \"keystone-cron-29535961-smmft\" (UID: 
\"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.428691 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-config-data\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.428722 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-fernet-keys\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.436364 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-fernet-keys\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.437191 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-combined-ca-bundle\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.437953 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-config-data\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.451183 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnzfl\" (UniqueName: \"kubernetes.io/projected/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-kube-api-access-hnzfl\") pod \"keystone-cron-29535961-smmft\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") " pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.474185 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:00 crc kubenswrapper[4771]: I0227 02:01:00.988333 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535961-smmft"] Feb 27 02:01:00 crc kubenswrapper[4771]: W0227 02:01:00.993383 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46c1ffed_6a0c_4b69_9dfe_2474731d06b7.slice/crio-5222023d80e9009b87db608b58eb6e9e6c3baa685b7750355025e537af8f544c WatchSource:0}: Error finding container 5222023d80e9009b87db608b58eb6e9e6c3baa685b7750355025e537af8f544c: Status 404 returned error can't find the container with id 5222023d80e9009b87db608b58eb6e9e6c3baa685b7750355025e537af8f544c Feb 27 02:01:01 crc kubenswrapper[4771]: I0227 02:01:01.209485 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535961-smmft" event={"ID":"46c1ffed-6a0c-4b69-9dfe-2474731d06b7","Type":"ContainerStarted","Data":"590887a838e10dce63b28d162a5886e5b55c71c0958a12e6f6a9a543701ba860"} Feb 27 02:01:01 crc kubenswrapper[4771]: I0227 02:01:01.209528 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535961-smmft" event={"ID":"46c1ffed-6a0c-4b69-9dfe-2474731d06b7","Type":"ContainerStarted","Data":"5222023d80e9009b87db608b58eb6e9e6c3baa685b7750355025e537af8f544c"} Feb 27 02:01:01 crc kubenswrapper[4771]: I0227 02:01:01.225813 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29535961-smmft" podStartSLOduration=1.225796358 podStartE2EDuration="1.225796358s" podCreationTimestamp="2026-02-27 02:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 02:01:01.223083346 +0000 UTC m=+3374.160644634" watchObservedRunningTime="2026-02-27 02:01:01.225796358 +0000 UTC m=+3374.163357646" Feb 27 02:01:02 crc kubenswrapper[4771]: I0227 02:01:02.773718 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:01:02 crc kubenswrapper[4771]: E0227 02:01:02.774380 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:01:03 crc kubenswrapper[4771]: I0227 02:01:03.226185 4771 generic.go:334] "Generic (PLEG): container finished" podID="46c1ffed-6a0c-4b69-9dfe-2474731d06b7" containerID="590887a838e10dce63b28d162a5886e5b55c71c0958a12e6f6a9a543701ba860" exitCode=0 Feb 27 02:01:03 crc kubenswrapper[4771]: I0227 02:01:03.226236 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535961-smmft" event={"ID":"46c1ffed-6a0c-4b69-9dfe-2474731d06b7","Type":"ContainerDied","Data":"590887a838e10dce63b28d162a5886e5b55c71c0958a12e6f6a9a543701ba860"} Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.604960 4771 util.go:48] "No ready sandbox for pod can be found. 
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.604960 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535961-smmft"
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.807962 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-combined-ca-bundle\") pod \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") "
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.808242 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnzfl\" (UniqueName: \"kubernetes.io/projected/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-kube-api-access-hnzfl\") pod \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") "
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.808311 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-config-data\") pod \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") "
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.808337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-fernet-keys\") pod \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\" (UID: \"46c1ffed-6a0c-4b69-9dfe-2474731d06b7\") "
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.814877 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "46c1ffed-6a0c-4b69-9dfe-2474731d06b7" (UID: "46c1ffed-6a0c-4b69-9dfe-2474731d06b7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.822703 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-kube-api-access-hnzfl" (OuterVolumeSpecName: "kube-api-access-hnzfl") pod "46c1ffed-6a0c-4b69-9dfe-2474731d06b7" (UID: "46c1ffed-6a0c-4b69-9dfe-2474731d06b7"). InnerVolumeSpecName "kube-api-access-hnzfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.840505 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46c1ffed-6a0c-4b69-9dfe-2474731d06b7" (UID: "46c1ffed-6a0c-4b69-9dfe-2474731d06b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.876212 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-config-data" (OuterVolumeSpecName: "config-data") pod "46c1ffed-6a0c-4b69-9dfe-2474731d06b7" (UID: "46c1ffed-6a0c-4b69-9dfe-2474731d06b7"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.911615 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnzfl\" (UniqueName: \"kubernetes.io/projected/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-kube-api-access-hnzfl\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.911652 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.911662 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:04 crc kubenswrapper[4771]: I0227 02:01:04.911670 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c1ffed-6a0c-4b69-9dfe-2474731d06b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:05 crc kubenswrapper[4771]: I0227 02:01:05.248308 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535961-smmft" event={"ID":"46c1ffed-6a0c-4b69-9dfe-2474731d06b7","Type":"ContainerDied","Data":"5222023d80e9009b87db608b58eb6e9e6c3baa685b7750355025e537af8f544c"} Feb 27 02:01:05 crc kubenswrapper[4771]: I0227 02:01:05.248646 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5222023d80e9009b87db608b58eb6e9e6c3baa685b7750355025e537af8f544c" Feb 27 02:01:05 crc kubenswrapper[4771]: I0227 02:01:05.248409 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535961-smmft" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.733070 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lz7zh"] Feb 27 02:01:11 crc kubenswrapper[4771]: E0227 02:01:11.734055 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c1ffed-6a0c-4b69-9dfe-2474731d06b7" containerName="keystone-cron" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.734066 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c1ffed-6a0c-4b69-9dfe-2474731d06b7" containerName="keystone-cron" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.734254 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c1ffed-6a0c-4b69-9dfe-2474731d06b7" containerName="keystone-cron" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.735619 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.745103 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lz7zh"] Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.792993 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqjdm\" (UniqueName: \"kubernetes.io/projected/d514f025-100f-47e6-bf55-43e597ce1f3a-kube-api-access-hqjdm\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.793106 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-catalog-content\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.793202 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-utilities\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.894954 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqjdm\" (UniqueName: \"kubernetes.io/projected/d514f025-100f-47e6-bf55-43e597ce1f3a-kube-api-access-hqjdm\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.895049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-catalog-content\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.895110 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-utilities\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.895612 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-utilities\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.895853 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-catalog-content\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:11 crc kubenswrapper[4771]: I0227 02:01:11.916355 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hqjdm\" (UniqueName: \"kubernetes.io/projected/d514f025-100f-47e6-bf55-43e597ce1f3a-kube-api-access-hqjdm\") pod \"community-operators-lz7zh\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:12 crc kubenswrapper[4771]: I0227 02:01:12.089163 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:12 crc kubenswrapper[4771]: I0227 02:01:12.673873 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lz7zh"] Feb 27 02:01:13 crc kubenswrapper[4771]: I0227 02:01:13.362507 4771 generic.go:334] "Generic (PLEG): container finished" podID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerID="643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f" exitCode=0 Feb 27 02:01:13 crc kubenswrapper[4771]: I0227 02:01:13.362590 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz7zh" event={"ID":"d514f025-100f-47e6-bf55-43e597ce1f3a","Type":"ContainerDied","Data":"643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f"} Feb 27 02:01:13 crc kubenswrapper[4771]: I0227 02:01:13.362776 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz7zh" event={"ID":"d514f025-100f-47e6-bf55-43e597ce1f3a","Type":"ContainerStarted","Data":"762361f094ac144edf5eeb29c0d3f425af371a87668e9bf581faad6ac3053b28"} Feb 27 02:01:15 crc kubenswrapper[4771]: I0227 02:01:15.380578 4771 generic.go:334] "Generic (PLEG): container finished" podID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerID="ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1" exitCode=0 Feb 27 02:01:15 crc kubenswrapper[4771]: I0227 02:01:15.380790 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz7zh" event={"ID":"d514f025-100f-47e6-bf55-43e597ce1f3a","Type":"ContainerDied","Data":"ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1"} Feb 27 02:01:16 crc kubenswrapper[4771]: I0227 02:01:16.394492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz7zh" event={"ID":"d514f025-100f-47e6-bf55-43e597ce1f3a","Type":"ContainerStarted","Data":"0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2"} Feb 27 02:01:16 crc kubenswrapper[4771]: I0227 02:01:16.419381 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lz7zh" podStartSLOduration=2.954570882 podStartE2EDuration="5.419358899s" podCreationTimestamp="2026-02-27 02:01:11 +0000 UTC" firstStartedPulling="2026-02-27 02:01:13.363969131 +0000 UTC m=+3386.301530409" lastFinishedPulling="2026-02-27 02:01:15.828757138 +0000 UTC m=+3388.766318426" observedRunningTime="2026-02-27 02:01:16.412031181 +0000 UTC m=+3389.349592489" watchObservedRunningTime="2026-02-27 02:01:16.419358899 +0000 UTC m=+3389.356920187" Feb 27 02:01:17 crc kubenswrapper[4771]: I0227 02:01:17.780373 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:01:17 crc kubenswrapper[4771]: E0227 02:01:17.780982 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:01:22 crc kubenswrapper[4771]: I0227 02:01:22.089949 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:22 crc kubenswrapper[4771]: I0227 02:01:22.090693 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:22 crc kubenswrapper[4771]: I0227 02:01:22.169901 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:22 crc kubenswrapper[4771]: I0227 02:01:22.511062 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:22 crc kubenswrapper[4771]: I0227 02:01:22.583835 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lz7zh"] Feb 27 02:01:24 crc kubenswrapper[4771]: I0227 02:01:24.468345 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lz7zh" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerName="registry-server" containerID="cri-o://0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2" gracePeriod=2 Feb 27 02:01:24 crc kubenswrapper[4771]: I0227 02:01:24.949004 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.063105 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-catalog-content\") pod \"d514f025-100f-47e6-bf55-43e597ce1f3a\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.063205 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqjdm\" (UniqueName: \"kubernetes.io/projected/d514f025-100f-47e6-bf55-43e597ce1f3a-kube-api-access-hqjdm\") pod \"d514f025-100f-47e6-bf55-43e597ce1f3a\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.063272 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-utilities\") pod \"d514f025-100f-47e6-bf55-43e597ce1f3a\" (UID: \"d514f025-100f-47e6-bf55-43e597ce1f3a\") " Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.064105 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-utilities" (OuterVolumeSpecName: "utilities") pod "d514f025-100f-47e6-bf55-43e597ce1f3a" (UID: "d514f025-100f-47e6-bf55-43e597ce1f3a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.080029 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d514f025-100f-47e6-bf55-43e597ce1f3a-kube-api-access-hqjdm" (OuterVolumeSpecName: "kube-api-access-hqjdm") pod "d514f025-100f-47e6-bf55-43e597ce1f3a" (UID: "d514f025-100f-47e6-bf55-43e597ce1f3a"). InnerVolumeSpecName "kube-api-access-hqjdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.122991 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d514f025-100f-47e6-bf55-43e597ce1f3a" (UID: "d514f025-100f-47e6-bf55-43e597ce1f3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.165272 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.165343 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqjdm\" (UniqueName: \"kubernetes.io/projected/d514f025-100f-47e6-bf55-43e597ce1f3a-kube-api-access-hqjdm\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.165354 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d514f025-100f-47e6-bf55-43e597ce1f3a-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.478187 4771 generic.go:334] "Generic (PLEG): container finished" podID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerID="0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2" exitCode=0 Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.478224 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz7zh" event={"ID":"d514f025-100f-47e6-bf55-43e597ce1f3a","Type":"ContainerDied","Data":"0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2"} Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.478242 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lz7zh" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.479577 4771 scope.go:117] "RemoveContainer" containerID="0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.479487 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz7zh" event={"ID":"d514f025-100f-47e6-bf55-43e597ce1f3a","Type":"ContainerDied","Data":"762361f094ac144edf5eeb29c0d3f425af371a87668e9bf581faad6ac3053b28"} Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.521513 4771 scope.go:117] "RemoveContainer" containerID="ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.547014 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lz7zh"] Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.552148 4771 scope.go:117] "RemoveContainer" containerID="643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.559027 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lz7zh"] Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.592292 4771 scope.go:117] "RemoveContainer" containerID="0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2" Feb 27 02:01:25 crc kubenswrapper[4771]: E0227 02:01:25.592888 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2\": container with ID starting with 0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2 not found: ID does not exist" containerID="0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.592947 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2"} err="failed to get container status \"0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2\": rpc error: code = NotFound desc = could not find container \"0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2\": container with ID starting with 0dab3b775e687105d8ad0c6e07aaab790d6749e19e332cb00008faa51bd11ec2 not found: ID does not exist" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.592979 4771 scope.go:117] "RemoveContainer" containerID="ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1" Feb 27 02:01:25 crc kubenswrapper[4771]: E0227 02:01:25.593413 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1\": container with ID starting with ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1 not found: ID does not exist" containerID="ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.593455 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1"} err="failed to get container status \"ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1\": rpc error: code = NotFound desc = could not find 
container \"ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1\": container with ID starting with ea73112139f932468cf984534c1ac15c75098563774d598d60e3206e7fc662d1 not found: ID does not exist" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.593484 4771 scope.go:117] "RemoveContainer" containerID="643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f" Feb 27 02:01:25 crc kubenswrapper[4771]: E0227 02:01:25.593784 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f\": container with ID starting with 643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f not found: ID does not exist" containerID="643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.593819 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f"} err="failed to get container status \"643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f\": rpc error: code = NotFound desc = could not find container \"643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f\": container with ID starting with 643c920ba25918e02b12279bbf8a31fc9a3a652738bf6144394b9c42fbd9e64f not found: ID does not exist" Feb 27 02:01:25 crc kubenswrapper[4771]: I0227 02:01:25.782035 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" path="/var/lib/kubelet/pods/d514f025-100f-47e6-bf55-43e597ce1f3a/volumes" Feb 27 02:01:30 crc kubenswrapper[4771]: I0227 02:01:30.773244 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:01:30 crc kubenswrapper[4771]: E0227 02:01:30.774227 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.649168 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-frcwp"] Feb 27 02:01:31 crc kubenswrapper[4771]: E0227 02:01:31.649768 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerName="registry-server" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.649799 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerName="registry-server" Feb 27 02:01:31 crc kubenswrapper[4771]: E0227 02:01:31.649847 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerName="extract-content" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.649861 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerName="extract-content" Feb 27 02:01:31 crc kubenswrapper[4771]: E0227 02:01:31.649891 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerName="extract-utilities" Feb 27 02:01:31 
crc kubenswrapper[4771]: I0227 02:01:31.649901 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerName="extract-utilities" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.650170 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d514f025-100f-47e6-bf55-43e597ce1f3a" containerName="registry-server" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.652631 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.662811 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frcwp"] Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.794901 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-catalog-content\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.795014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-utilities\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.795142 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7m5\" (UniqueName: \"kubernetes.io/projected/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-kube-api-access-bk7m5\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.896577 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-catalog-content\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.896642 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-utilities\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.896693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7m5\" (UniqueName: \"kubernetes.io/projected/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-kube-api-access-bk7m5\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.897780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-catalog-content\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " 
pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.897886 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-utilities\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.919329 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7m5\" (UniqueName: \"kubernetes.io/projected/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-kube-api-access-bk7m5\") pod \"certified-operators-frcwp\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") " pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:31 crc kubenswrapper[4771]: I0227 02:01:31.987541 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frcwp" Feb 27 02:01:32 crc kubenswrapper[4771]: I0227 02:01:32.481311 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frcwp"] Feb 27 02:01:32 crc kubenswrapper[4771]: I0227 02:01:32.557144 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcwp" event={"ID":"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0","Type":"ContainerStarted","Data":"d81c1b0db4ddd4c93d3c8d19c82541c66c9a6989be221203c6f9028743277cc0"} Feb 27 02:01:33 crc kubenswrapper[4771]: I0227 02:01:33.571845 4771 generic.go:334] "Generic (PLEG): container finished" podID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerID="4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760" exitCode=0 Feb 27 02:01:33 crc kubenswrapper[4771]: I0227 02:01:33.571954 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcwp" event={"ID":"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0","Type":"ContainerDied","Data":"4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760"} Feb 27 02:01:34 crc kubenswrapper[4771]: I0227 02:01:34.584948 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcwp" event={"ID":"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0","Type":"ContainerStarted","Data":"02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112"} Feb 27 02:01:35 crc kubenswrapper[4771]: I0227 02:01:35.593209 4771 generic.go:334] "Generic (PLEG): container finished" podID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerID="02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112" exitCode=0 Feb 27 02:01:35 crc kubenswrapper[4771]: I0227 02:01:35.593253 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcwp" event={"ID":"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0","Type":"ContainerDied","Data":"02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112"} Feb 27 02:01:36 crc kubenswrapper[4771]: I0227 02:01:36.609982 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcwp" event={"ID":"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0","Type":"ContainerStarted","Data":"4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e"} Feb 27 02:01:36 crc kubenswrapper[4771]: I0227 02:01:36.641429 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-frcwp" podStartSLOduration=3.108346423 
podStartE2EDuration="5.641401075s" podCreationTimestamp="2026-02-27 02:01:31 +0000 UTC" firstStartedPulling="2026-02-27 02:01:33.574768773 +0000 UTC m=+3406.512330101" lastFinishedPulling="2026-02-27 02:01:36.107823425 +0000 UTC m=+3409.045384753" observedRunningTime="2026-02-27 02:01:36.625442784 +0000 UTC m=+3409.563004072" watchObservedRunningTime="2026-02-27 02:01:36.641401075 +0000 UTC m=+3409.578962403"
Feb 27 02:01:41 crc kubenswrapper[4771]: I0227 02:01:41.988176 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-frcwp"
Feb 27 02:01:41 crc kubenswrapper[4771]: I0227 02:01:41.989693 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-frcwp"
Feb 27 02:01:42 crc kubenswrapper[4771]: I0227 02:01:42.077215 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-frcwp"
Feb 27 02:01:42 crc kubenswrapper[4771]: I0227 02:01:42.761896 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-frcwp"
Feb 27 02:01:42 crc kubenswrapper[4771]: I0227 02:01:42.930784 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frcwp"]
Feb 27 02:01:43 crc kubenswrapper[4771]: I0227 02:01:43.773679 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc"
Feb 27 02:01:43 crc kubenswrapper[4771]: E0227 02:01:43.775399 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 02:01:44 crc kubenswrapper[4771]: I0227 02:01:44.694938 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-frcwp" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerName="registry-server" containerID="cri-o://4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e" gracePeriod=2
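The pod_startup_latency_tracker entry above separates pull time from start time: podStartSLOduration excludes the image-pull window, so the 5.641401075s E2E duration minus the pull window (02:01:36.107823425 − 02:01:33.574768773 = 2.533054652s) yields exactly the logged podStartSLOduration=3.108346423. A short sketch of that arithmetic; the constants are copied from the log line (monotonic m=+… suffixes dropped), everything else is illustrative:

```go
// slo_duration.go - reproduces the podStartSLOduration arithmetic from the
// pod_startup_latency_tracker entry above: SLO duration = E2E duration
// minus the image-pull window (lastFinishedPulling - firstStartedPulling).
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	firstPull, _ := time.Parse(layout, "2026-02-27 02:01:33.574768773 +0000 UTC")
	lastPull, _ := time.Parse(layout, "2026-02-27 02:01:36.107823425 +0000 UTC")
	e2e := 5641401075 * time.Nanosecond // podStartE2EDuration="5.641401075s"

	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println(slo) // 3.108346423s, matching podStartSLOduration=3.108346423
}
```

For the keystone-cron pod earlier, both pull timestamps are the zero value (no image pull happened), which is why its SLO and E2E durations are identical.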
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.283392 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frcwp"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.373497 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-utilities\") pod \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") "
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.373702 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-catalog-content\") pod \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") "
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.373788 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk7m5\" (UniqueName: \"kubernetes.io/projected/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-kube-api-access-bk7m5\") pod \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\" (UID: \"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0\") "
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.374347 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-utilities" (OuterVolumeSpecName: "utilities") pod "a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" (UID: "a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.381084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-kube-api-access-bk7m5" (OuterVolumeSpecName: "kube-api-access-bk7m5") pod "a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" (UID: "a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0"). InnerVolumeSpecName "kube-api-access-bk7m5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.435442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" (UID: "a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.476015 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.476049 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.476062 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk7m5\" (UniqueName: \"kubernetes.io/projected/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0-kube-api-access-bk7m5\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.706038 4771 generic.go:334] "Generic (PLEG): container finished" podID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerID="4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e" exitCode=0 Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.706075 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcwp" event={"ID":"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0","Type":"ContainerDied","Data":"4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e"} Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.706406 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcwp" event={"ID":"a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0","Type":"ContainerDied","Data":"d81c1b0db4ddd4c93d3c8d19c82541c66c9a6989be221203c6f9028743277cc0"} Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.706475 4771 scope.go:117] "RemoveContainer" containerID="4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e" Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.706108 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frcwp"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.727607 4771 scope.go:117] "RemoveContainer" containerID="02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.740869 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frcwp"]
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.748040 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-frcwp"]
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.762890 4771 scope.go:117] "RemoveContainer" containerID="4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.789464 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" path="/var/lib/kubelet/pods/a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0/volumes"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.806138 4771 scope.go:117] "RemoveContainer" containerID="4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e"
Feb 27 02:01:45 crc kubenswrapper[4771]: E0227 02:01:45.806714 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e\": container with ID starting with 4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e not found: ID does not exist" containerID="4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.806761 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e"} err="failed to get container status \"4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e\": rpc error: code = NotFound desc = could not find container \"4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e\": container with ID starting with 4ac7db66d3cd247c2ca19892b14ef69d638835f2518c04a5dd15d11465543e4e not found: ID does not exist"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.806787 4771 scope.go:117] "RemoveContainer" containerID="02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112"
Feb 27 02:01:45 crc kubenswrapper[4771]: E0227 02:01:45.807253 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112\": container with ID starting with 02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112 not found: ID does not exist" containerID="02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.807326 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112"} err="failed to get container status \"02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112\": rpc error: code = NotFound desc = could not find container \"02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112\": container with ID starting with 02b66a70630190f7a36a02d1e094c39bfec23b5611b3754941e5b45881742112 not found: ID does not exist"
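The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triplets above are a benign race: the kubelet re-issues removal for container IDs that CRI-O has already deleted, and the NotFound status simply confirms the container is gone. A toy sketch of that treat-NotFound-as-success pattern; the runtimeClient type is hypothetical, not the real CRI client:

```go
// idempotent_remove.go - illustrative only: shows the "NotFound means
// already deleted" handling suggested by the log above, where a second
// removal attempt for the same container ID is not a real failure.
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound: ID does not exist")

type runtimeClient struct{ containers map[string]bool }

func (c *runtimeClient) RemoveContainer(id string) error {
	if !c.containers[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	delete(c.containers, id)
	return nil
}

// removeIfPresent is idempotent: a NotFound error means the desired end
// state (container gone) already holds, so it is not surfaced as a failure.
func removeIfPresent(c *runtimeClient, id string) error {
	err := c.RemoveContainer(id)
	if errors.Is(err, errNotFound) {
		return nil // already removed by a concurrent cleanup path
	}
	return err
}

func main() {
	c := &runtimeClient{containers: map[string]bool{"0dab3b775e68": true}}
	fmt.Println(removeIfPresent(c, "0dab3b775e68")) // <nil>: removed now
	fmt.Println(removeIfPresent(c, "0dab3b775e68")) // <nil>: NotFound tolerated
}
```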
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.807371 4771 scope.go:117] "RemoveContainer" containerID="4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760"
Feb 27 02:01:45 crc kubenswrapper[4771]: E0227 02:01:45.807811 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760\": container with ID starting with 4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760 not found: ID does not exist" containerID="4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760"
Feb 27 02:01:45 crc kubenswrapper[4771]: I0227 02:01:45.807863 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760"} err="failed to get container status \"4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760\": rpc error: code = NotFound desc = could not find container \"4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760\": container with ID starting with 4ff4bf5f3a4d764464fd917b4dfa894a37695db6921d959e7bed71337e58f760 not found: ID does not exist"
Feb 27 02:01:52 crc kubenswrapper[4771]: I0227 02:01:52.787027 4771 generic.go:334] "Generic (PLEG): container finished" podID="4b362ce5-5892-43a0-8ec9-e280131b32ee" containerID="12d228d3dc7206d21966d9d081e359bb92788e3bc231ff9e7f545fb2af557236" exitCode=0
Feb 27 02:01:52 crc kubenswrapper[4771]: I0227 02:01:52.787155 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4b362ce5-5892-43a0-8ec9-e280131b32ee","Type":"ContainerDied","Data":"12d228d3dc7206d21966d9d081e359bb92788e3bc231ff9e7f545fb2af557236"}
Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.253738 4771 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.455859 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ca-certs\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.455904 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-temporary\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.455954 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.455992 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-workdir\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.456086 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzlmx\" (UniqueName: \"kubernetes.io/projected/4b362ce5-5892-43a0-8ec9-e280131b32ee-kube-api-access-mzlmx\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.456147 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.456183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config-secret\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.456202 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ssh-key\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.456248 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-config-data\") pod \"4b362ce5-5892-43a0-8ec9-e280131b32ee\" (UID: \"4b362ce5-5892-43a0-8ec9-e280131b32ee\") " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.456632 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.457863 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-config-data" (OuterVolumeSpecName: "config-data") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.462963 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.463522 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.464587 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b362ce5-5892-43a0-8ec9-e280131b32ee-kube-api-access-mzlmx" (OuterVolumeSpecName: "kube-api-access-mzlmx") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "kube-api-access-mzlmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.488504 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.501696 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.516676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.526949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4b362ce5-5892-43a0-8ec9-e280131b32ee" (UID: "4b362ce5-5892-43a0-8ec9-e280131b32ee"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558865 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558895 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558904 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-config-data\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558913 4771 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4b362ce5-5892-43a0-8ec9-e280131b32ee-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558922 4771 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558948 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558958 4771 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4b362ce5-5892-43a0-8ec9-e280131b32ee-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558971 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzlmx\" (UniqueName: \"kubernetes.io/projected/4b362ce5-5892-43a0-8ec9-e280131b32ee-kube-api-access-mzlmx\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.558981 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4b362ce5-5892-43a0-8ec9-e280131b32ee-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.577169 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.661183 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.810773 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"4b362ce5-5892-43a0-8ec9-e280131b32ee","Type":"ContainerDied","Data":"3dea908ac4d6f63aee8c97dc34c8eaa1b6d1c11b0ebce05954f743fe66bad5c4"} Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.810817 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dea908ac4d6f63aee8c97dc34c8eaa1b6d1c11b0ebce05954f743fe66bad5c4" Feb 27 02:01:54 crc kubenswrapper[4771]: I0227 02:01:54.810872 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 27 02:01:55 crc kubenswrapper[4771]: I0227 02:01:55.773364 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:01:55 crc kubenswrapper[4771]: E0227 02:01:55.773985 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.662407 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 02:01:57 crc kubenswrapper[4771]: E0227 02:01:57.664147 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b362ce5-5892-43a0-8ec9-e280131b32ee" containerName="tempest-tests-tempest-tests-runner" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.664194 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b362ce5-5892-43a0-8ec9-e280131b32ee" containerName="tempest-tests-tempest-tests-runner" Feb 27 02:01:57 crc kubenswrapper[4771]: E0227 02:01:57.664217 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerName="extract-content" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.664225 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerName="extract-content" Feb 27 02:01:57 crc kubenswrapper[4771]: E0227 02:01:57.664253 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerName="extract-utilities" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.664260 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerName="extract-utilities" Feb 27 02:01:57 crc kubenswrapper[4771]: E0227 02:01:57.664279 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerName="registry-server" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.664314 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerName="registry-server" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.664509 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b362ce5-5892-43a0-8ec9-e280131b32ee" containerName="tempest-tests-tempest-tests-runner" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.664533 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08ba2dd-a5c3-4f4f-ba63-34d3bdc02de0" containerName="registry-server" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 
02:01:57.665336 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.670186 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.694766 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-q58qr" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.817349 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct7qd\" (UniqueName: \"kubernetes.io/projected/8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4-kube-api-access-ct7qd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.817499 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.817530 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d82m6"] Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.819535 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.839443 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d82m6"] Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.918961 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p7qn\" (UniqueName: \"kubernetes.io/projected/f853f3cb-f8fe-46d6-a177-929f0c7a55db-kube-api-access-9p7qn\") pod \"redhat-operators-d82m6\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.919020 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-utilities\") pod \"redhat-operators-d82m6\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.919153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct7qd\" (UniqueName: \"kubernetes.io/projected/8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4-kube-api-access-ct7qd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.919189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-catalog-content\") pod \"redhat-operators-d82m6\" (UID: 
\"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.919269 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.919643 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.943464 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct7qd\" (UniqueName: \"kubernetes.io/projected/8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4-kube-api-access-ct7qd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:57 crc kubenswrapper[4771]: I0227 02:01:57.944360 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.021010 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p7qn\" (UniqueName: \"kubernetes.io/projected/f853f3cb-f8fe-46d6-a177-929f0c7a55db-kube-api-access-9p7qn\") pod \"redhat-operators-d82m6\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.021070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-utilities\") pod \"redhat-operators-d82m6\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.021145 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-catalog-content\") pod \"redhat-operators-d82m6\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.021573 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-catalog-content\") pod \"redhat-operators-d82m6\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.022084 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-utilities\") pod \"redhat-operators-d82m6\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.022250 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.041472 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p7qn\" (UniqueName: \"kubernetes.io/projected/f853f3cb-f8fe-46d6-a177-929f0c7a55db-kube-api-access-9p7qn\") pod \"redhat-operators-d82m6\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.136217 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.501808 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 27 02:01:58 crc kubenswrapper[4771]: W0227 02:01:58.624110 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf853f3cb_f8fe_46d6_a177_929f0c7a55db.slice/crio-917960a269556c7a0bad0c402a4b99eda069cbbd216ed078c9a0978c3d18871c WatchSource:0}: Error finding container 917960a269556c7a0bad0c402a4b99eda069cbbd216ed078c9a0978c3d18871c: Status 404 returned error can't find the container with id 917960a269556c7a0bad0c402a4b99eda069cbbd216ed078c9a0978c3d18871c Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.624262 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d82m6"] Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.844934 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4","Type":"ContainerStarted","Data":"492dd0c061096a66620674881295ffd56a6b21e1b43d02d094ec4dbdf77c6ffb"} Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.847604 4771 generic.go:334] "Generic (PLEG): container finished" podID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerID="e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42" exitCode=0 Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.847639 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d82m6" event={"ID":"f853f3cb-f8fe-46d6-a177-929f0c7a55db","Type":"ContainerDied","Data":"e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42"} Feb 27 02:01:58 crc kubenswrapper[4771]: I0227 02:01:58.847660 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d82m6" event={"ID":"f853f3cb-f8fe-46d6-a177-929f0c7a55db","Type":"ContainerStarted","Data":"917960a269556c7a0bad0c402a4b99eda069cbbd216ed078c9a0978c3d18871c"} Feb 27 02:01:59 crc kubenswrapper[4771]: I0227 02:01:59.858604 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d82m6" event={"ID":"f853f3cb-f8fe-46d6-a177-929f0c7a55db","Type":"ContainerStarted","Data":"53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8"} Feb 27 02:01:59 crc kubenswrapper[4771]: I0227 02:01:59.861702 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4","Type":"ContainerStarted","Data":"ec1005d121f80b60117d84ac20e1c0fad22061f2420d17ffaf79131b565e539f"} Feb 27 02:01:59 crc kubenswrapper[4771]: I0227 02:01:59.904766 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.166100759 podStartE2EDuration="2.904745888s" podCreationTimestamp="2026-02-27 02:01:57 +0000 UTC" firstStartedPulling="2026-02-27 02:01:58.508015326 +0000 UTC m=+3431.445576614" lastFinishedPulling="2026-02-27 02:01:59.246660455 +0000 UTC m=+3432.184221743" observedRunningTime="2026-02-27 02:01:59.892506097 +0000 UTC m=+3432.830067385" watchObservedRunningTime="2026-02-27 02:01:59.904745888 +0000 UTC m=+3432.842307186" Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.150971 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535962-jjvmv"] Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.152273 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535962-jjvmv" Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.154906 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.155187 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.158102 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.159307 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535962-jjvmv"] Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.260452 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s2jw\" (UniqueName: \"kubernetes.io/projected/181d3584-175e-49e7-9efc-58426d4b4903-kube-api-access-8s2jw\") pod \"auto-csr-approver-29535962-jjvmv\" (UID: \"181d3584-175e-49e7-9efc-58426d4b4903\") " pod="openshift-infra/auto-csr-approver-29535962-jjvmv" Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.362137 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s2jw\" (UniqueName: \"kubernetes.io/projected/181d3584-175e-49e7-9efc-58426d4b4903-kube-api-access-8s2jw\") pod \"auto-csr-approver-29535962-jjvmv\" (UID: \"181d3584-175e-49e7-9efc-58426d4b4903\") " pod="openshift-infra/auto-csr-approver-29535962-jjvmv" Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.384934 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s2jw\" (UniqueName: \"kubernetes.io/projected/181d3584-175e-49e7-9efc-58426d4b4903-kube-api-access-8s2jw\") pod \"auto-csr-approver-29535962-jjvmv\" (UID: \"181d3584-175e-49e7-9efc-58426d4b4903\") " pod="openshift-infra/auto-csr-approver-29535962-jjvmv" Feb 27 02:02:00 crc kubenswrapper[4771]: I0227 02:02:00.506241 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535962-jjvmv" Feb 27 02:02:01 crc kubenswrapper[4771]: W0227 02:02:01.015485 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod181d3584_175e_49e7_9efc_58426d4b4903.slice/crio-535e465224394fd14aeab06a28734c07aa8d563f864e1784f4a21cf0d149dc86 WatchSource:0}: Error finding container 535e465224394fd14aeab06a28734c07aa8d563f864e1784f4a21cf0d149dc86: Status 404 returned error can't find the container with id 535e465224394fd14aeab06a28734c07aa8d563f864e1784f4a21cf0d149dc86 Feb 27 02:02:01 crc kubenswrapper[4771]: I0227 02:02:01.017472 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535962-jjvmv"] Feb 27 02:02:01 crc kubenswrapper[4771]: I0227 02:02:01.879797 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535962-jjvmv" event={"ID":"181d3584-175e-49e7-9efc-58426d4b4903","Type":"ContainerStarted","Data":"535e465224394fd14aeab06a28734c07aa8d563f864e1784f4a21cf0d149dc86"} Feb 27 02:02:02 crc kubenswrapper[4771]: I0227 02:02:02.892841 4771 generic.go:334] "Generic (PLEG): container finished" podID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerID="53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8" exitCode=0 Feb 27 02:02:02 crc kubenswrapper[4771]: I0227 02:02:02.892900 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d82m6" event={"ID":"f853f3cb-f8fe-46d6-a177-929f0c7a55db","Type":"ContainerDied","Data":"53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8"} Feb 27 02:02:03 crc kubenswrapper[4771]: I0227 02:02:03.904808 4771 generic.go:334] "Generic (PLEG): container finished" podID="181d3584-175e-49e7-9efc-58426d4b4903" containerID="7db21d7f6dd657851cbd3d0512cf3dd1d2643656cf5f8a28b2c89c5d03ae58bf" exitCode=0 Feb 27 02:02:03 crc kubenswrapper[4771]: I0227 02:02:03.905171 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535962-jjvmv" event={"ID":"181d3584-175e-49e7-9efc-58426d4b4903","Type":"ContainerDied","Data":"7db21d7f6dd657851cbd3d0512cf3dd1d2643656cf5f8a28b2c89c5d03ae58bf"} Feb 27 02:02:03 crc kubenswrapper[4771]: I0227 02:02:03.908211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d82m6" event={"ID":"f853f3cb-f8fe-46d6-a177-929f0c7a55db","Type":"ContainerStarted","Data":"217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f"} Feb 27 02:02:03 crc kubenswrapper[4771]: I0227 02:02:03.957938 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d82m6" podStartSLOduration=2.379220995 podStartE2EDuration="6.957911164s" podCreationTimestamp="2026-02-27 02:01:57 +0000 UTC" firstStartedPulling="2026-02-27 02:01:58.849118918 +0000 UTC m=+3431.786680206" lastFinishedPulling="2026-02-27 02:02:03.427809087 +0000 UTC m=+3436.365370375" observedRunningTime="2026-02-27 02:02:03.94293889 +0000 UTC m=+3436.880500198" watchObservedRunningTime="2026-02-27 02:02:03.957911164 +0000 UTC m=+3436.895472442" Feb 27 02:02:05 crc kubenswrapper[4771]: I0227 02:02:05.351116 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535962-jjvmv" Feb 27 02:02:05 crc kubenswrapper[4771]: I0227 02:02:05.473030 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s2jw\" (UniqueName: \"kubernetes.io/projected/181d3584-175e-49e7-9efc-58426d4b4903-kube-api-access-8s2jw\") pod \"181d3584-175e-49e7-9efc-58426d4b4903\" (UID: \"181d3584-175e-49e7-9efc-58426d4b4903\") " Feb 27 02:02:05 crc kubenswrapper[4771]: I0227 02:02:05.485076 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181d3584-175e-49e7-9efc-58426d4b4903-kube-api-access-8s2jw" (OuterVolumeSpecName: "kube-api-access-8s2jw") pod "181d3584-175e-49e7-9efc-58426d4b4903" (UID: "181d3584-175e-49e7-9efc-58426d4b4903"). InnerVolumeSpecName "kube-api-access-8s2jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:02:05 crc kubenswrapper[4771]: I0227 02:02:05.576033 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s2jw\" (UniqueName: \"kubernetes.io/projected/181d3584-175e-49e7-9efc-58426d4b4903-kube-api-access-8s2jw\") on node \"crc\" DevicePath \"\"" Feb 27 02:02:05 crc kubenswrapper[4771]: I0227 02:02:05.931101 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535962-jjvmv" event={"ID":"181d3584-175e-49e7-9efc-58426d4b4903","Type":"ContainerDied","Data":"535e465224394fd14aeab06a28734c07aa8d563f864e1784f4a21cf0d149dc86"} Feb 27 02:02:05 crc kubenswrapper[4771]: I0227 02:02:05.931239 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="535e465224394fd14aeab06a28734c07aa8d563f864e1784f4a21cf0d149dc86" Feb 27 02:02:05 crc kubenswrapper[4771]: I0227 02:02:05.931137 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535962-jjvmv" Feb 27 02:02:06 crc kubenswrapper[4771]: I0227 02:02:06.426582 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535956-fm6j9"] Feb 27 02:02:06 crc kubenswrapper[4771]: I0227 02:02:06.439745 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535956-fm6j9"] Feb 27 02:02:06 crc kubenswrapper[4771]: I0227 02:02:06.773396 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:02:06 crc kubenswrapper[4771]: E0227 02:02:06.773913 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:02:07 crc kubenswrapper[4771]: I0227 02:02:07.782764 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85e9c07-b194-45d7-936a-33ade4484507" path="/var/lib/kubelet/pods/c85e9c07-b194-45d7-936a-33ade4484507/volumes" Feb 27 02:02:08 crc kubenswrapper[4771]: I0227 02:02:08.139007 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:02:08 crc kubenswrapper[4771]: I0227 02:02:08.139072 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:02:09 crc kubenswrapper[4771]: I0227 02:02:09.195789 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d82m6" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="registry-server" probeResult="failure" output=< Feb 27 02:02:09 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 27 02:02:09 crc kubenswrapper[4771]: > Feb 27 02:02:18 crc kubenswrapper[4771]: I0227 02:02:18.214743 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:02:18 crc kubenswrapper[4771]: I0227 02:02:18.301975 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:02:18 crc kubenswrapper[4771]: I0227 02:02:18.756164 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d82m6"] Feb 27 02:02:19 crc kubenswrapper[4771]: I0227 02:02:19.774944 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:02:19 crc kubenswrapper[4771]: E0227 02:02:19.775209 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.087028 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d82m6" 
podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="registry-server" containerID="cri-o://217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f" gracePeriod=2 Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.608874 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.678222 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-catalog-content\") pod \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.678301 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p7qn\" (UniqueName: \"kubernetes.io/projected/f853f3cb-f8fe-46d6-a177-929f0c7a55db-kube-api-access-9p7qn\") pod \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.678390 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-utilities\") pod \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\" (UID: \"f853f3cb-f8fe-46d6-a177-929f0c7a55db\") " Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.679661 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-utilities" (OuterVolumeSpecName: "utilities") pod "f853f3cb-f8fe-46d6-a177-929f0c7a55db" (UID: "f853f3cb-f8fe-46d6-a177-929f0c7a55db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.685860 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f853f3cb-f8fe-46d6-a177-929f0c7a55db-kube-api-access-9p7qn" (OuterVolumeSpecName: "kube-api-access-9p7qn") pod "f853f3cb-f8fe-46d6-a177-929f0c7a55db" (UID: "f853f3cb-f8fe-46d6-a177-929f0c7a55db"). InnerVolumeSpecName "kube-api-access-9p7qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.781148 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p7qn\" (UniqueName: \"kubernetes.io/projected/f853f3cb-f8fe-46d6-a177-929f0c7a55db-kube-api-access-9p7qn\") on node \"crc\" DevicePath \"\"" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.781198 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.806758 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f853f3cb-f8fe-46d6-a177-929f0c7a55db" (UID: "f853f3cb-f8fe-46d6-a177-929f0c7a55db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.818284 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7zpf6/must-gather-69wsd"] Feb 27 02:02:20 crc kubenswrapper[4771]: E0227 02:02:20.818801 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="extract-content" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.818827 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="extract-content" Feb 27 02:02:20 crc kubenswrapper[4771]: E0227 02:02:20.818863 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181d3584-175e-49e7-9efc-58426d4b4903" containerName="oc" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.818873 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="181d3584-175e-49e7-9efc-58426d4b4903" containerName="oc" Feb 27 02:02:20 crc kubenswrapper[4771]: E0227 02:02:20.818891 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="extract-utilities" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.818900 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="extract-utilities" Feb 27 02:02:20 crc kubenswrapper[4771]: E0227 02:02:20.818924 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="registry-server" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.818931 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="registry-server" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.819167 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerName="registry-server" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.819205 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="181d3584-175e-49e7-9efc-58426d4b4903" containerName="oc" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.820475 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.823195 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7zpf6"/"openshift-service-ca.crt" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.823437 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7zpf6"/"kube-root-ca.crt" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.833140 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7zpf6/must-gather-69wsd"] Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.885196 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27073f48-03fd-4fce-99f4-730ba4479ae8-must-gather-output\") pod \"must-gather-69wsd\" (UID: \"27073f48-03fd-4fce-99f4-730ba4479ae8\") " pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.885265 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j8qr\" (UniqueName: \"kubernetes.io/projected/27073f48-03fd-4fce-99f4-730ba4479ae8-kube-api-access-5j8qr\") pod \"must-gather-69wsd\" (UID: \"27073f48-03fd-4fce-99f4-730ba4479ae8\") " pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.885681 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f853f3cb-f8fe-46d6-a177-929f0c7a55db-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.987294 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27073f48-03fd-4fce-99f4-730ba4479ae8-must-gather-output\") pod \"must-gather-69wsd\" (UID: \"27073f48-03fd-4fce-99f4-730ba4479ae8\") " pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.987346 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j8qr\" (UniqueName: \"kubernetes.io/projected/27073f48-03fd-4fce-99f4-730ba4479ae8-kube-api-access-5j8qr\") pod \"must-gather-69wsd\" (UID: \"27073f48-03fd-4fce-99f4-730ba4479ae8\") " pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:02:20 crc kubenswrapper[4771]: I0227 02:02:20.988104 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27073f48-03fd-4fce-99f4-730ba4479ae8-must-gather-output\") pod \"must-gather-69wsd\" (UID: \"27073f48-03fd-4fce-99f4-730ba4479ae8\") " pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.008191 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j8qr\" (UniqueName: \"kubernetes.io/projected/27073f48-03fd-4fce-99f4-730ba4479ae8-kube-api-access-5j8qr\") pod \"must-gather-69wsd\" (UID: \"27073f48-03fd-4fce-99f4-730ba4479ae8\") " pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.096902 4771 generic.go:334] "Generic (PLEG): container finished" podID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" containerID="217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f" exitCode=0 Feb 27 
02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.096943 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d82m6" event={"ID":"f853f3cb-f8fe-46d6-a177-929f0c7a55db","Type":"ContainerDied","Data":"217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f"} Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.096971 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d82m6" event={"ID":"f853f3cb-f8fe-46d6-a177-929f0c7a55db","Type":"ContainerDied","Data":"917960a269556c7a0bad0c402a4b99eda069cbbd216ed078c9a0978c3d18871c"} Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.096990 4771 scope.go:117] "RemoveContainer" containerID="217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.097119 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d82m6" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.133869 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d82m6"] Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.135889 4771 scope.go:117] "RemoveContainer" containerID="53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.138865 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.142320 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d82m6"] Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.167363 4771 scope.go:117] "RemoveContainer" containerID="e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.190140 4771 scope.go:117] "RemoveContainer" containerID="217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f" Feb 27 02:02:21 crc kubenswrapper[4771]: E0227 02:02:21.190623 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f\": container with ID starting with 217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f not found: ID does not exist" containerID="217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.190670 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f"} err="failed to get container status \"217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f\": rpc error: code = NotFound desc = could not find container \"217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f\": container with ID starting with 217051c63ac5eb7b63440f6f9fccf0ca2f93a692d3f589f8b1752234fa1b070f not found: ID does not exist" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.190698 4771 scope.go:117] "RemoveContainer" containerID="53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8" Feb 27 02:02:21 crc kubenswrapper[4771]: E0227 02:02:21.190981 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8\": container with ID starting with 53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8 not found: ID does not exist" containerID="53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.191009 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8"} err="failed to get container status \"53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8\": rpc error: code = NotFound desc = could not find container \"53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8\": container with ID starting with 53c4e276a532cffa248b2109631d06f99eb1a8eb219efc946cce160c87bb0ff8 not found: ID does not exist" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.191029 4771 scope.go:117] "RemoveContainer" containerID="e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42" Feb 27 02:02:21 crc kubenswrapper[4771]: E0227 02:02:21.191228 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42\": container with ID starting with e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42 not found: ID does not exist" containerID="e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.191245 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42"} err="failed to get container status \"e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42\": rpc error: code = NotFound desc = could not find container \"e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42\": container with ID starting with e0df41afaa786c0476137c81c17dcc233ffb5a6d52802b649879153b46ba6c42 not found: ID does not exist" Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.656895 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7zpf6/must-gather-69wsd"] Feb 27 02:02:21 crc kubenswrapper[4771]: W0227 02:02:21.660291 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27073f48_03fd_4fce_99f4_730ba4479ae8.slice/crio-48aab6e9e34bb876c70dc5718285f2f4f11ccb5c26d6a55e2bc65a5a327047a7 WatchSource:0}: Error finding container 48aab6e9e34bb876c70dc5718285f2f4f11ccb5c26d6a55e2bc65a5a327047a7: Status 404 returned error can't find the container with id 48aab6e9e34bb876c70dc5718285f2f4f11ccb5c26d6a55e2bc65a5a327047a7 Feb 27 02:02:21 crc kubenswrapper[4771]: I0227 02:02:21.788734 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f853f3cb-f8fe-46d6-a177-929f0c7a55db" path="/var/lib/kubelet/pods/f853f3cb-f8fe-46d6-a177-929f0c7a55db/volumes" Feb 27 02:02:22 crc kubenswrapper[4771]: I0227 02:02:22.108093 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/must-gather-69wsd" event={"ID":"27073f48-03fd-4fce-99f4-730ba4479ae8","Type":"ContainerStarted","Data":"48aab6e9e34bb876c70dc5718285f2f4f11ccb5c26d6a55e2bc65a5a327047a7"} Feb 27 02:02:28 crc kubenswrapper[4771]: I0227 02:02:28.169003 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-7zpf6/must-gather-69wsd" event={"ID":"27073f48-03fd-4fce-99f4-730ba4479ae8","Type":"ContainerStarted","Data":"1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58"} Feb 27 02:02:29 crc kubenswrapper[4771]: I0227 02:02:29.182149 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/must-gather-69wsd" event={"ID":"27073f48-03fd-4fce-99f4-730ba4479ae8","Type":"ContainerStarted","Data":"c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14"} Feb 27 02:02:29 crc kubenswrapper[4771]: I0227 02:02:29.201057 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7zpf6/must-gather-69wsd" podStartSLOduration=3.187914558 podStartE2EDuration="9.201037476s" podCreationTimestamp="2026-02-27 02:02:20 +0000 UTC" firstStartedPulling="2026-02-27 02:02:21.662958114 +0000 UTC m=+3454.600519402" lastFinishedPulling="2026-02-27 02:02:27.676081032 +0000 UTC m=+3460.613642320" observedRunningTime="2026-02-27 02:02:29.196034812 +0000 UTC m=+3462.133596170" watchObservedRunningTime="2026-02-27 02:02:29.201037476 +0000 UTC m=+3462.138598754" Feb 27 02:02:30 crc kubenswrapper[4771]: I0227 02:02:30.726167 4771 scope.go:117] "RemoveContainer" containerID="abe02359aba3470f4e2235c39f84b863bd1f52bba85116ab6a231d4f94e48ca3" Feb 27 02:02:30 crc kubenswrapper[4771]: I0227 02:02:30.774530 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:02:30 crc kubenswrapper[4771]: E0227 02:02:30.774855 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.730273 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-zn55p"] Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.732160 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.734505 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7zpf6"/"default-dockercfg-cdp67" Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.798223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72lcz\" (UniqueName: \"kubernetes.io/projected/676a9309-6de7-4402-8cdd-64b9200ebf40-kube-api-access-72lcz\") pod \"crc-debug-zn55p\" (UID: \"676a9309-6de7-4402-8cdd-64b9200ebf40\") " pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.798353 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676a9309-6de7-4402-8cdd-64b9200ebf40-host\") pod \"crc-debug-zn55p\" (UID: \"676a9309-6de7-4402-8cdd-64b9200ebf40\") " pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.900210 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72lcz\" (UniqueName: \"kubernetes.io/projected/676a9309-6de7-4402-8cdd-64b9200ebf40-kube-api-access-72lcz\") pod \"crc-debug-zn55p\" (UID: \"676a9309-6de7-4402-8cdd-64b9200ebf40\") " pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.900299 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676a9309-6de7-4402-8cdd-64b9200ebf40-host\") pod \"crc-debug-zn55p\" (UID: \"676a9309-6de7-4402-8cdd-64b9200ebf40\") " pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.900829 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676a9309-6de7-4402-8cdd-64b9200ebf40-host\") pod \"crc-debug-zn55p\" (UID: \"676a9309-6de7-4402-8cdd-64b9200ebf40\") " pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:02:31 crc kubenswrapper[4771]: I0227 02:02:31.918848 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72lcz\" (UniqueName: \"kubernetes.io/projected/676a9309-6de7-4402-8cdd-64b9200ebf40-kube-api-access-72lcz\") pod \"crc-debug-zn55p\" (UID: \"676a9309-6de7-4402-8cdd-64b9200ebf40\") " pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:02:32 crc kubenswrapper[4771]: I0227 02:02:32.070177 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:02:32 crc kubenswrapper[4771]: W0227 02:02:32.124573 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676a9309_6de7_4402_8cdd_64b9200ebf40.slice/crio-704a896d7402154b7c1457811e18512a8ac8684c2bb4a64f9b1e543b9b66daa8 WatchSource:0}: Error finding container 704a896d7402154b7c1457811e18512a8ac8684c2bb4a64f9b1e543b9b66daa8: Status 404 returned error can't find the container with id 704a896d7402154b7c1457811e18512a8ac8684c2bb4a64f9b1e543b9b66daa8 Feb 27 02:02:32 crc kubenswrapper[4771]: I0227 02:02:32.207741 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/crc-debug-zn55p" event={"ID":"676a9309-6de7-4402-8cdd-64b9200ebf40","Type":"ContainerStarted","Data":"704a896d7402154b7c1457811e18512a8ac8684c2bb4a64f9b1e543b9b66daa8"} Feb 27 02:02:45 crc kubenswrapper[4771]: I0227 02:02:45.333008 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/crc-debug-zn55p" event={"ID":"676a9309-6de7-4402-8cdd-64b9200ebf40","Type":"ContainerStarted","Data":"967763f8e5adef6c55bbc7a0788966df348fbcf05d5212fa79c08317cf897070"} Feb 27 02:02:45 crc kubenswrapper[4771]: I0227 02:02:45.352942 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7zpf6/crc-debug-zn55p" podStartSLOduration=2.238503869 podStartE2EDuration="14.352923428s" podCreationTimestamp="2026-02-27 02:02:31 +0000 UTC" firstStartedPulling="2026-02-27 02:02:32.127625167 +0000 UTC m=+3465.065186455" lastFinishedPulling="2026-02-27 02:02:44.242044726 +0000 UTC m=+3477.179606014" observedRunningTime="2026-02-27 02:02:45.343875464 +0000 UTC m=+3478.281436752" watchObservedRunningTime="2026-02-27 02:02:45.352923428 +0000 UTC m=+3478.290484716" Feb 27 02:02:45 crc kubenswrapper[4771]: I0227 02:02:45.774295 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:02:45 crc kubenswrapper[4771]: E0227 02:02:45.774657 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:03:00 crc kubenswrapper[4771]: I0227 02:03:00.774228 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:03:00 crc kubenswrapper[4771]: E0227 02:03:00.774895 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:03:14 crc kubenswrapper[4771]: I0227 02:03:14.773377 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:03:14 crc kubenswrapper[4771]: E0227 02:03:14.774139 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:03:21 crc kubenswrapper[4771]: I0227 02:03:21.669677 4771 generic.go:334] "Generic (PLEG): container finished" podID="676a9309-6de7-4402-8cdd-64b9200ebf40" containerID="967763f8e5adef6c55bbc7a0788966df348fbcf05d5212fa79c08317cf897070" exitCode=0 Feb 27 02:03:21 crc kubenswrapper[4771]: I0227 02:03:21.669774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/crc-debug-zn55p" event={"ID":"676a9309-6de7-4402-8cdd-64b9200ebf40","Type":"ContainerDied","Data":"967763f8e5adef6c55bbc7a0788966df348fbcf05d5212fa79c08317cf897070"} Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.776762 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.794820 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676a9309-6de7-4402-8cdd-64b9200ebf40-host\") pod \"676a9309-6de7-4402-8cdd-64b9200ebf40\" (UID: \"676a9309-6de7-4402-8cdd-64b9200ebf40\") " Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.795079 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/676a9309-6de7-4402-8cdd-64b9200ebf40-host" (OuterVolumeSpecName: "host") pod "676a9309-6de7-4402-8cdd-64b9200ebf40" (UID: "676a9309-6de7-4402-8cdd-64b9200ebf40"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.795126 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72lcz\" (UniqueName: \"kubernetes.io/projected/676a9309-6de7-4402-8cdd-64b9200ebf40-kube-api-access-72lcz\") pod \"676a9309-6de7-4402-8cdd-64b9200ebf40\" (UID: \"676a9309-6de7-4402-8cdd-64b9200ebf40\") " Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.795583 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676a9309-6de7-4402-8cdd-64b9200ebf40-host\") on node \"crc\" DevicePath \"\"" Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.804353 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/676a9309-6de7-4402-8cdd-64b9200ebf40-kube-api-access-72lcz" (OuterVolumeSpecName: "kube-api-access-72lcz") pod "676a9309-6de7-4402-8cdd-64b9200ebf40" (UID: "676a9309-6de7-4402-8cdd-64b9200ebf40"). InnerVolumeSpecName "kube-api-access-72lcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.817253 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-zn55p"] Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.826746 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-zn55p"] Feb 27 02:03:22 crc kubenswrapper[4771]: I0227 02:03:22.897732 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72lcz\" (UniqueName: \"kubernetes.io/projected/676a9309-6de7-4402-8cdd-64b9200ebf40-kube-api-access-72lcz\") on node \"crc\" DevicePath \"\"" Feb 27 02:03:23 crc kubenswrapper[4771]: I0227 02:03:23.687207 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="704a896d7402154b7c1457811e18512a8ac8684c2bb4a64f9b1e543b9b66daa8" Feb 27 02:03:23 crc kubenswrapper[4771]: I0227 02:03:23.687247 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-zn55p" Feb 27 02:03:23 crc kubenswrapper[4771]: I0227 02:03:23.784030 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="676a9309-6de7-4402-8cdd-64b9200ebf40" path="/var/lib/kubelet/pods/676a9309-6de7-4402-8cdd-64b9200ebf40/volumes" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.074175 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-jcv89"] Feb 27 02:03:24 crc kubenswrapper[4771]: E0227 02:03:24.075531 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="676a9309-6de7-4402-8cdd-64b9200ebf40" containerName="container-00" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.075573 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="676a9309-6de7-4402-8cdd-64b9200ebf40" containerName="container-00" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.076239 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="676a9309-6de7-4402-8cdd-64b9200ebf40" containerName="container-00" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.077518 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.083226 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7zpf6"/"default-dockercfg-cdp67" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.120668 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24b487ef-7d56-4749-bb70-e5b47eaea491-host\") pod \"crc-debug-jcv89\" (UID: \"24b487ef-7d56-4749-bb70-e5b47eaea491\") " pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.120957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md7z7\" (UniqueName: \"kubernetes.io/projected/24b487ef-7d56-4749-bb70-e5b47eaea491-kube-api-access-md7z7\") pod \"crc-debug-jcv89\" (UID: \"24b487ef-7d56-4749-bb70-e5b47eaea491\") " pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.224175 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24b487ef-7d56-4749-bb70-e5b47eaea491-host\") pod \"crc-debug-jcv89\" (UID: \"24b487ef-7d56-4749-bb70-e5b47eaea491\") " pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.224318 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24b487ef-7d56-4749-bb70-e5b47eaea491-host\") pod \"crc-debug-jcv89\" (UID: \"24b487ef-7d56-4749-bb70-e5b47eaea491\") " pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.224488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md7z7\" (UniqueName: \"kubernetes.io/projected/24b487ef-7d56-4749-bb70-e5b47eaea491-kube-api-access-md7z7\") pod \"crc-debug-jcv89\" (UID: \"24b487ef-7d56-4749-bb70-e5b47eaea491\") " pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.250405 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md7z7\" (UniqueName: \"kubernetes.io/projected/24b487ef-7d56-4749-bb70-e5b47eaea491-kube-api-access-md7z7\") pod \"crc-debug-jcv89\" (UID: \"24b487ef-7d56-4749-bb70-e5b47eaea491\") " pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.399600 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.696680 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/crc-debug-jcv89" event={"ID":"24b487ef-7d56-4749-bb70-e5b47eaea491","Type":"ContainerStarted","Data":"c47d4c1673062628653468d9986882cab1a0097d10df323372b5a945b973eed7"} Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.696730 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/crc-debug-jcv89" event={"ID":"24b487ef-7d56-4749-bb70-e5b47eaea491","Type":"ContainerStarted","Data":"75d53dfd3095134737d88e58f0c07d7405f5260d6980b33a30ea85bc381865ba"} Feb 27 02:03:24 crc kubenswrapper[4771]: I0227 02:03:24.718326 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7zpf6/crc-debug-jcv89" podStartSLOduration=0.718308678 podStartE2EDuration="718.308678ms" podCreationTimestamp="2026-02-27 02:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 02:03:24.712895732 +0000 UTC m=+3517.650457030" watchObservedRunningTime="2026-02-27 02:03:24.718308678 +0000 UTC m=+3517.655869966" Feb 27 02:03:25 crc kubenswrapper[4771]: I0227 02:03:25.707761 4771 generic.go:334] "Generic (PLEG): container finished" podID="24b487ef-7d56-4749-bb70-e5b47eaea491" containerID="c47d4c1673062628653468d9986882cab1a0097d10df323372b5a945b973eed7" exitCode=0 Feb 27 02:03:25 crc kubenswrapper[4771]: I0227 02:03:25.707855 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/crc-debug-jcv89" event={"ID":"24b487ef-7d56-4749-bb70-e5b47eaea491","Type":"ContainerDied","Data":"c47d4c1673062628653468d9986882cab1a0097d10df323372b5a945b973eed7"} Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.804702 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.833925 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-jcv89"] Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.843294 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-jcv89"] Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.871253 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24b487ef-7d56-4749-bb70-e5b47eaea491-host\") pod \"24b487ef-7d56-4749-bb70-e5b47eaea491\" (UID: \"24b487ef-7d56-4749-bb70-e5b47eaea491\") " Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.871347 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b487ef-7d56-4749-bb70-e5b47eaea491-host" (OuterVolumeSpecName: "host") pod "24b487ef-7d56-4749-bb70-e5b47eaea491" (UID: "24b487ef-7d56-4749-bb70-e5b47eaea491"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.871399 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md7z7\" (UniqueName: \"kubernetes.io/projected/24b487ef-7d56-4749-bb70-e5b47eaea491-kube-api-access-md7z7\") pod \"24b487ef-7d56-4749-bb70-e5b47eaea491\" (UID: \"24b487ef-7d56-4749-bb70-e5b47eaea491\") " Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.871965 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24b487ef-7d56-4749-bb70-e5b47eaea491-host\") on node \"crc\" DevicePath \"\"" Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.876730 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b487ef-7d56-4749-bb70-e5b47eaea491-kube-api-access-md7z7" (OuterVolumeSpecName: "kube-api-access-md7z7") pod "24b487ef-7d56-4749-bb70-e5b47eaea491" (UID: "24b487ef-7d56-4749-bb70-e5b47eaea491"). InnerVolumeSpecName "kube-api-access-md7z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:03:26 crc kubenswrapper[4771]: I0227 02:03:26.973653 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md7z7\" (UniqueName: \"kubernetes.io/projected/24b487ef-7d56-4749-bb70-e5b47eaea491-kube-api-access-md7z7\") on node \"crc\" DevicePath \"\"" Feb 27 02:03:27 crc kubenswrapper[4771]: I0227 02:03:27.729258 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75d53dfd3095134737d88e58f0c07d7405f5260d6980b33a30ea85bc381865ba" Feb 27 02:03:27 crc kubenswrapper[4771]: I0227 02:03:27.729349 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-jcv89" Feb 27 02:03:27 crc kubenswrapper[4771]: I0227 02:03:27.782018 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:03:27 crc kubenswrapper[4771]: E0227 02:03:27.782352 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:03:27 crc kubenswrapper[4771]: I0227 02:03:27.786286 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b487ef-7d56-4749-bb70-e5b47eaea491" path="/var/lib/kubelet/pods/24b487ef-7d56-4749-bb70-e5b47eaea491/volumes" Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.041351 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-j8zs4"] Feb 27 02:03:28 crc kubenswrapper[4771]: E0227 02:03:28.042382 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b487ef-7d56-4749-bb70-e5b47eaea491" containerName="container-00" Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.042402 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b487ef-7d56-4749-bb70-e5b47eaea491" containerName="container-00" Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.042669 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b487ef-7d56-4749-bb70-e5b47eaea491" containerName="container-00" Feb 27 02:03:28 crc kubenswrapper[4771]: 
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.043411 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.045683 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7zpf6"/"default-dockercfg-cdp67"
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.093425 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh2xb\" (UniqueName: \"kubernetes.io/projected/089b165e-a878-44ba-8c90-7efcba82bfd3-kube-api-access-bh2xb\") pod \"crc-debug-j8zs4\" (UID: \"089b165e-a878-44ba-8c90-7efcba82bfd3\") " pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.093516 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/089b165e-a878-44ba-8c90-7efcba82bfd3-host\") pod \"crc-debug-j8zs4\" (UID: \"089b165e-a878-44ba-8c90-7efcba82bfd3\") " pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.195938 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh2xb\" (UniqueName: \"kubernetes.io/projected/089b165e-a878-44ba-8c90-7efcba82bfd3-kube-api-access-bh2xb\") pod \"crc-debug-j8zs4\" (UID: \"089b165e-a878-44ba-8c90-7efcba82bfd3\") " pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.196525 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/089b165e-a878-44ba-8c90-7efcba82bfd3-host\") pod \"crc-debug-j8zs4\" (UID: \"089b165e-a878-44ba-8c90-7efcba82bfd3\") " pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.196468 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/089b165e-a878-44ba-8c90-7efcba82bfd3-host\") pod \"crc-debug-j8zs4\" (UID: \"089b165e-a878-44ba-8c90-7efcba82bfd3\") " pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.218522 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh2xb\" (UniqueName: \"kubernetes.io/projected/089b165e-a878-44ba-8c90-7efcba82bfd3-kube-api-access-bh2xb\") pod \"crc-debug-j8zs4\" (UID: \"089b165e-a878-44ba-8c90-7efcba82bfd3\") " pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.365086 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:28 crc kubenswrapper[4771]: W0227 02:03:28.400509 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod089b165e_a878_44ba_8c90_7efcba82bfd3.slice/crio-6f1e888d3702baf1d32ea8b406f7cca355ccfefbddd20e6e6b7cbd2b951957cb WatchSource:0}: Error finding container 6f1e888d3702baf1d32ea8b406f7cca355ccfefbddd20e6e6b7cbd2b951957cb: Status 404 returned error can't find the container with id 6f1e888d3702baf1d32ea8b406f7cca355ccfefbddd20e6e6b7cbd2b951957cb
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.742764 4771 generic.go:334] "Generic (PLEG): container finished" podID="089b165e-a878-44ba-8c90-7efcba82bfd3" containerID="4c132d5d6f6cbda4e20fe860dac2fb0d52cfdd0c89a444a673fe5951da24b7fd" exitCode=0
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.743153 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/crc-debug-j8zs4" event={"ID":"089b165e-a878-44ba-8c90-7efcba82bfd3","Type":"ContainerDied","Data":"4c132d5d6f6cbda4e20fe860dac2fb0d52cfdd0c89a444a673fe5951da24b7fd"}
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.743198 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/crc-debug-j8zs4" event={"ID":"089b165e-a878-44ba-8c90-7efcba82bfd3","Type":"ContainerStarted","Data":"6f1e888d3702baf1d32ea8b406f7cca355ccfefbddd20e6e6b7cbd2b951957cb"}
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.781538 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-j8zs4"]
Feb 27 02:03:28 crc kubenswrapper[4771]: I0227 02:03:28.791692 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7zpf6/crc-debug-j8zs4"]
Feb 27 02:03:29 crc kubenswrapper[4771]: I0227 02:03:29.886033 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:29 crc kubenswrapper[4771]: I0227 02:03:29.923604 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/089b165e-a878-44ba-8c90-7efcba82bfd3-host\") pod \"089b165e-a878-44ba-8c90-7efcba82bfd3\" (UID: \"089b165e-a878-44ba-8c90-7efcba82bfd3\") "
Feb 27 02:03:29 crc kubenswrapper[4771]: I0227 02:03:29.923745 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/089b165e-a878-44ba-8c90-7efcba82bfd3-host" (OuterVolumeSpecName: "host") pod "089b165e-a878-44ba-8c90-7efcba82bfd3" (UID: "089b165e-a878-44ba-8c90-7efcba82bfd3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 02:03:29 crc kubenswrapper[4771]: I0227 02:03:29.923846 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh2xb\" (UniqueName: \"kubernetes.io/projected/089b165e-a878-44ba-8c90-7efcba82bfd3-kube-api-access-bh2xb\") pod \"089b165e-a878-44ba-8c90-7efcba82bfd3\" (UID: \"089b165e-a878-44ba-8c90-7efcba82bfd3\") "
Feb 27 02:03:29 crc kubenswrapper[4771]: I0227 02:03:29.924402 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/089b165e-a878-44ba-8c90-7efcba82bfd3-host\") on node \"crc\" DevicePath \"\""
Feb 27 02:03:29 crc kubenswrapper[4771]: I0227 02:03:29.934145 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089b165e-a878-44ba-8c90-7efcba82bfd3-kube-api-access-bh2xb" (OuterVolumeSpecName: "kube-api-access-bh2xb") pod "089b165e-a878-44ba-8c90-7efcba82bfd3" (UID: "089b165e-a878-44ba-8c90-7efcba82bfd3"). InnerVolumeSpecName "kube-api-access-bh2xb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 02:03:30 crc kubenswrapper[4771]: I0227 02:03:30.026295 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh2xb\" (UniqueName: \"kubernetes.io/projected/089b165e-a878-44ba-8c90-7efcba82bfd3-kube-api-access-bh2xb\") on node \"crc\" DevicePath \"\""
Feb 27 02:03:30 crc kubenswrapper[4771]: I0227 02:03:30.774174 4771 scope.go:117] "RemoveContainer" containerID="4c132d5d6f6cbda4e20fe860dac2fb0d52cfdd0c89a444a673fe5951da24b7fd"
Feb 27 02:03:30 crc kubenswrapper[4771]: I0227 02:03:30.774201 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7zpf6/crc-debug-j8zs4"
Feb 27 02:03:31 crc kubenswrapper[4771]: I0227 02:03:31.785911 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="089b165e-a878-44ba-8c90-7efcba82bfd3" path="/var/lib/kubelet/pods/089b165e-a878-44ba-8c90-7efcba82bfd3/volumes"
Feb 27 02:03:39 crc kubenswrapper[4771]: I0227 02:03:39.773745 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc"
Feb 27 02:03:40 crc kubenswrapper[4771]: I0227 02:03:40.858935 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"d25c2ee99624ab7384d501c40defa38c3465b16865a65e0252cb74db46e114bb"}
Feb 27 02:03:45 crc kubenswrapper[4771]: I0227 02:03:45.506225 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69fd595d46-6k6cs_d010a73f-6034-48ea-b18b-3bad26fe39ee/barbican-api/0.log"
Feb 27 02:03:45 crc kubenswrapper[4771]: I0227 02:03:45.736628 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69fd595d46-6k6cs_d010a73f-6034-48ea-b18b-3bad26fe39ee/barbican-api-log/0.log"
Feb 27 02:03:45 crc kubenswrapper[4771]: I0227 02:03:45.779302 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-66c7555cc4-mtbzr_13fb6f6e-1dda-4e09-971a-d0629bc44ff4/barbican-keystone-listener/0.log"
Feb 27 02:03:45 crc kubenswrapper[4771]: I0227 02:03:45.834250 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-66c7555cc4-mtbzr_13fb6f6e-1dda-4e09-971a-d0629bc44ff4/barbican-keystone-listener-log/0.log"
Feb 27 02:03:45 crc kubenswrapper[4771]: I0227 02:03:45.970497 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5894b4657f-lj4ff_24cb181d-8c43-4ae8-9af0-b28f570f7f22/barbican-worker/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.055995 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5894b4657f-lj4ff_24cb181d-8c43-4ae8-9af0-b28f570f7f22/barbican-worker-log/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.168965 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5_34ae2923-be95-45e5-a840-dfea9b17f9c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.273130 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_26685008-55b9-4176-98b8-f915a6004b36/ceilometer-central-agent/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.306689 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_26685008-55b9-4176-98b8-f915a6004b36/ceilometer-notification-agent/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.407890 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_26685008-55b9-4176-98b8-f915a6004b36/proxy-httpd/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.453441 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_26685008-55b9-4176-98b8-f915a6004b36/sg-core/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.562135 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b708a5c-dd83-482a-bf4a-988909a38d76/cinder-api/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.643958 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b708a5c-dd83-482a-bf4a-988909a38d76/cinder-api-log/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.774098 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e15c68f3-a904-4d91-a778-4e5b5a728c9f/probe/0.log"
Feb 27 02:03:46 crc kubenswrapper[4771]: I0227 02:03:46.825108 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e15c68f3-a904-4d91-a778-4e5b5a728c9f/cinder-scheduler/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.154072 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf_acd636bf-528e-4bbe-8220-e4a9b755b025/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.208398 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl_67424e5d-eec0-4d0c-ba08-eebe40f4ac6e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.395705 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b7nss_13aff92d-bbb5-4229-8296-90dea52e389a/init/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.542717 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b7nss_13aff92d-bbb5-4229-8296-90dea52e389a/init/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.571792 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b7nss_13aff92d-bbb5-4229-8296-90dea52e389a/dnsmasq-dns/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.620443 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-47l7f_761add5e-bade-44af-be1b-3cbcaa54f19a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.832134 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_03b15be0-3bda-4754-b43a-35e34cb84fcb/glance-log/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.846470 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_03b15be0-3bda-4754-b43a-35e34cb84fcb/glance-httpd/0.log"
Feb 27 02:03:47 crc kubenswrapper[4771]: I0227 02:03:47.997899 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8c197b80-0aa2-49fd-b9d6-19cbb40e59e3/glance-httpd/0.log"
Feb 27 02:03:48 crc kubenswrapper[4771]: I0227 02:03:48.037788 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8c197b80-0aa2-49fd-b9d6-19cbb40e59e3/glance-log/0.log"
Feb 27 02:03:48 crc kubenswrapper[4771]: I0227 02:03:48.136708 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c84df64-lmgxw_9db15a3b-2c83-4d54-b5ea-697e6362b4e9/horizon/0.log"
Feb 27 02:03:48 crc kubenswrapper[4771]: I0227 02:03:48.376263 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5_d903dbaa-f429-4c92-8c5a-17c1622bf8bd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:48 crc kubenswrapper[4771]: I0227 02:03:48.534948 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4kmmj_53bf3a2a-497c-4432-8b0f-e8092fcb72ff/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:48 crc kubenswrapper[4771]: I0227 02:03:48.537275 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c84df64-lmgxw_9db15a3b-2c83-4d54-b5ea-697e6362b4e9/horizon-log/0.log"
Feb 27 02:03:48 crc kubenswrapper[4771]: I0227 02:03:48.764897 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-56bfd8fdf6-rxxnr_b66f9559-0d35-47b3-ab89-06425ff3afd3/keystone-api/0.log"
Feb 27 02:03:48 crc kubenswrapper[4771]: I0227 02:03:48.815098 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29535961-smmft_46c1ffed-6a0c-4b69-9dfe-2474731d06b7/keystone-cron/0.log"
Feb 27 02:03:48 crc kubenswrapper[4771]: I0227 02:03:48.956170 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_336e9838-30f4-4164-8664-073e172d8750/kube-state-metrics/0.log"
Feb 27 02:03:49 crc kubenswrapper[4771]: I0227 02:03:49.050469 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz_40c7ae0e-123b-42cf-99cf-57309d7c22b0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:49 crc kubenswrapper[4771]: I0227 02:03:49.416222 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fd6bd959-l4htk_db54a8be-2fc6-4aee-b505-e1a526407006/neutron-api/0.log"
Feb 27 02:03:49 crc kubenswrapper[4771]: I0227 02:03:49.481883 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fd6bd959-l4htk_db54a8be-2fc6-4aee-b505-e1a526407006/neutron-httpd/0.log"
Feb 27 02:03:49 crc kubenswrapper[4771]: I0227 02:03:49.786305 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl_9a2ce866-27c5-4ac5-8a27-d44ba505c3d8/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:50 crc kubenswrapper[4771]: I0227 02:03:50.260008 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_03b297ed-ac7f-4416-b929-b3d463bc5d72/nova-cell0-conductor-conductor/0.log"
Feb 27 02:03:50 crc kubenswrapper[4771]: I0227 02:03:50.264409 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_01f57ee9-e99c-48b0-834b-af553e0c7e5f/nova-api-log/0.log"
Feb 27 02:03:50 crc kubenswrapper[4771]: I0227 02:03:50.341613 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_01f57ee9-e99c-48b0-834b-af553e0c7e5f/nova-api-api/0.log"
Feb 27 02:03:50 crc kubenswrapper[4771]: I0227 02:03:50.581514 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_fdf27295-a275-4fb3-9e79-c3627df37a39/nova-cell1-conductor-conductor/0.log"
Feb 27 02:03:50 crc kubenswrapper[4771]: I0227 02:03:50.595176 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f748ee94-8cc7-4616-a035-a35770442cbc/nova-cell1-novncproxy-novncproxy/0.log"
Feb 27 02:03:51 crc kubenswrapper[4771]: I0227 02:03:51.133479 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-6lnt2_d2a7b19f-a0a4-4aa8-80c5-f05300c19d99/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:51 crc kubenswrapper[4771]: I0227 02:03:51.156198 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_81dfb61e-b373-4273-b55c-0d4680f89779/nova-metadata-log/0.log"
Feb 27 02:03:51 crc kubenswrapper[4771]: I0227 02:03:51.544142 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_39fb27d1-e9a6-44e4-9f92-d5f0242a8007/mysql-bootstrap/0.log"
Feb 27 02:03:51 crc kubenswrapper[4771]: I0227 02:03:51.569045 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ec376af9-95db-45e8-bb5b-1a4bec9e0197/nova-scheduler-scheduler/0.log"
Feb 27 02:03:51 crc kubenswrapper[4771]: I0227 02:03:51.811059 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_39fb27d1-e9a6-44e4-9f92-d5f0242a8007/galera/0.log"
Feb 27 02:03:51 crc kubenswrapper[4771]: I0227 02:03:51.813263 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_39fb27d1-e9a6-44e4-9f92-d5f0242a8007/mysql-bootstrap/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.034923 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8be4acd2-0f92-4f9f-9521-5da586b712f0/mysql-bootstrap/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.184751 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_81dfb61e-b373-4273-b55c-0d4680f89779/nova-metadata-metadata/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.213129 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8be4acd2-0f92-4f9f-9521-5da586b712f0/galera/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.228760 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8be4acd2-0f92-4f9f-9521-5da586b712f0/mysql-bootstrap/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.401382 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de/openstackclient/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.461501 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fcmgm_3ef0bfcb-87a8-4b1d-9084-3486da00981a/openstack-network-exporter/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.643357 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tjchc_000564b2-d16b-45fb-ba91-e65b85bd7fb5/ovsdb-server-init/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.865392 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tjchc_000564b2-d16b-45fb-ba91-e65b85bd7fb5/ovsdb-server-init/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.890376 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tjchc_000564b2-d16b-45fb-ba91-e65b85bd7fb5/ovsdb-server/0.log"
Feb 27 02:03:52 crc kubenswrapper[4771]: I0227 02:03:52.929195 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tjchc_000564b2-d16b-45fb-ba91-e65b85bd7fb5/ovs-vswitchd/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.125668 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s5lkp_8c578c69-744e-425b-8bb1-76eec4b332ec/ovn-controller/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.192247 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-c9bfk_64b58c2b-7189-40c8-94b0-c31f167845d1/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.352912 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_65f02053-1ff7-4e60-ae6e-e25c36df39da/ovn-northd/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.374843 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_65f02053-1ff7-4e60-ae6e-e25c36df39da/openstack-network-exporter/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.544590 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13396b98-6f5b-4800-854f-7b7d6af4cda4/openstack-network-exporter/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.617377 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13396b98-6f5b-4800-854f-7b7d6af4cda4/ovsdbserver-nb/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.728913 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ed3a808f-7dba-4f32-a081-29eab07e84c0/openstack-network-exporter/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.793510 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ed3a808f-7dba-4f32-a081-29eab07e84c0/ovsdbserver-sb/0.log"
Feb 27 02:03:53 crc kubenswrapper[4771]: I0227 02:03:53.920843 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5cddbc5576-b9kzz_577d7298-4011-4f66-a59c-36b823400652/placement-api/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.105641 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5cddbc5576-b9kzz_577d7298-4011-4f66-a59c-36b823400652/placement-log/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.126640 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_370e8739-d955-433e-8f61-b8e3bc1d8dc7/setup-container/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.364010 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_370e8739-d955-433e-8f61-b8e3bc1d8dc7/rabbitmq/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.376920 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_370e8739-d955-433e-8f61-b8e3bc1d8dc7/setup-container/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.421768 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7813115d-b642-406c-892d-61b10c9777d2/setup-container/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.572302 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7813115d-b642-406c-892d-61b10c9777d2/setup-container/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.629873 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7813115d-b642-406c-892d-61b10c9777d2/rabbitmq/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.837029 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k_df815e54-72eb-44e8-b6dd-a1758fd381e0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:54 crc kubenswrapper[4771]: I0227 02:03:54.976661 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-f88s5_4e2d0148-506d-458b-89c3-1faf19410b6b/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.046153 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt_c6b0ecf8-2611-4192-94ad-c4f9974cbab9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.215348 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pr4dd_6124b0a4-176b-41d9-8ebc-db0675eeb0e4/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.280533 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2v7s2_59c79bc1-5dcb-495e-8ce8-7c74517d2df6/ssh-known-hosts-edpm-deployment/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.514188 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b8d8fb79c-qxz4q_d0f1ec21-667d-46de-abbb-cb95d29e861c/proxy-server/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.567899 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b8d8fb79c-qxz4q_d0f1ec21-667d-46de-abbb-cb95d29e861c/proxy-httpd/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.632155 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cm796_8a59a151-f189-4128-b462-29557b12a8da/swift-ring-rebalance/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.763498 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/account-reaper/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.790255 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/account-auditor/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.890069 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/account-replicator/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.970631 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/container-auditor/0.log"
Feb 27 02:03:55 crc kubenswrapper[4771]: I0227 02:03:55.992078 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/account-server/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.129563 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/container-server/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.132291 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/container-updater/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.157090 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/container-replicator/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.281346 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-auditor/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.321572 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-replicator/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.372743 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-expirer/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.438808 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-server/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.509205 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-updater/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.583876 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/rsync/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.590007 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/swift-recon-cron/0.log"
Feb 27 02:03:56 crc kubenswrapper[4771]: I0227 02:03:56.830423 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4b362ce5-5892-43a0-8ec9-e280131b32ee/tempest-tests-tempest-tests-runner/0.log"
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-hlghk_dc880077-8590-47a1-a434-e8cebcf3fff1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:03:57 crc kubenswrapper[4771]: I0227 02:03:57.061361 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4/test-operator-logs-container/0.log" Feb 27 02:03:57 crc kubenswrapper[4771]: I0227 02:03:57.151161 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5_fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.146658 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535964-8mtr2"] Feb 27 02:04:00 crc kubenswrapper[4771]: E0227 02:04:00.147686 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089b165e-a878-44ba-8c90-7efcba82bfd3" containerName="container-00" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.147702 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="089b165e-a878-44ba-8c90-7efcba82bfd3" containerName="container-00" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.147943 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="089b165e-a878-44ba-8c90-7efcba82bfd3" containerName="container-00" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.149353 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535964-8mtr2" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.156347 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.157716 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.161955 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.163452 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535964-8mtr2"] Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.308490 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whc2k\" (UniqueName: \"kubernetes.io/projected/c11c843e-c871-4ff4-96d0-6b35e86d9453-kube-api-access-whc2k\") pod \"auto-csr-approver-29535964-8mtr2\" (UID: \"c11c843e-c871-4ff4-96d0-6b35e86d9453\") " pod="openshift-infra/auto-csr-approver-29535964-8mtr2" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.410178 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whc2k\" (UniqueName: \"kubernetes.io/projected/c11c843e-c871-4ff4-96d0-6b35e86d9453-kube-api-access-whc2k\") pod \"auto-csr-approver-29535964-8mtr2\" (UID: \"c11c843e-c871-4ff4-96d0-6b35e86d9453\") " pod="openshift-infra/auto-csr-approver-29535964-8mtr2" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.428441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whc2k\" (UniqueName: \"kubernetes.io/projected/c11c843e-c871-4ff4-96d0-6b35e86d9453-kube-api-access-whc2k\") pod \"auto-csr-approver-29535964-8mtr2\" (UID: 
\"c11c843e-c871-4ff4-96d0-6b35e86d9453\") " pod="openshift-infra/auto-csr-approver-29535964-8mtr2" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.475381 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535964-8mtr2" Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.986405 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535964-8mtr2"] Feb 27 02:04:00 crc kubenswrapper[4771]: I0227 02:04:00.996829 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 02:04:01 crc kubenswrapper[4771]: I0227 02:04:01.042831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535964-8mtr2" event={"ID":"c11c843e-c871-4ff4-96d0-6b35e86d9453","Type":"ContainerStarted","Data":"163d73278c26f2665a4a38614d0e14f3a19e38b8f04e9e206d722ebdbb30aa27"} Feb 27 02:04:03 crc kubenswrapper[4771]: I0227 02:04:03.064340 4771 generic.go:334] "Generic (PLEG): container finished" podID="c11c843e-c871-4ff4-96d0-6b35e86d9453" containerID="513bb3111da60e377e92c8c5a04c6b0cddf7af8484da324650a02bca60986f30" exitCode=0 Feb 27 02:04:03 crc kubenswrapper[4771]: I0227 02:04:03.064765 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535964-8mtr2" event={"ID":"c11c843e-c871-4ff4-96d0-6b35e86d9453","Type":"ContainerDied","Data":"513bb3111da60e377e92c8c5a04c6b0cddf7af8484da324650a02bca60986f30"} Feb 27 02:04:04 crc kubenswrapper[4771]: I0227 02:04:04.419632 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535964-8mtr2" Feb 27 02:04:04 crc kubenswrapper[4771]: I0227 02:04:04.569878 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whc2k\" (UniqueName: \"kubernetes.io/projected/c11c843e-c871-4ff4-96d0-6b35e86d9453-kube-api-access-whc2k\") pod \"c11c843e-c871-4ff4-96d0-6b35e86d9453\" (UID: \"c11c843e-c871-4ff4-96d0-6b35e86d9453\") " Feb 27 02:04:04 crc kubenswrapper[4771]: I0227 02:04:04.575494 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11c843e-c871-4ff4-96d0-6b35e86d9453-kube-api-access-whc2k" (OuterVolumeSpecName: "kube-api-access-whc2k") pod "c11c843e-c871-4ff4-96d0-6b35e86d9453" (UID: "c11c843e-c871-4ff4-96d0-6b35e86d9453"). InnerVolumeSpecName "kube-api-access-whc2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:04:04 crc kubenswrapper[4771]: I0227 02:04:04.672349 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whc2k\" (UniqueName: \"kubernetes.io/projected/c11c843e-c871-4ff4-96d0-6b35e86d9453-kube-api-access-whc2k\") on node \"crc\" DevicePath \"\"" Feb 27 02:04:05 crc kubenswrapper[4771]: I0227 02:04:05.086868 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535964-8mtr2" event={"ID":"c11c843e-c871-4ff4-96d0-6b35e86d9453","Type":"ContainerDied","Data":"163d73278c26f2665a4a38614d0e14f3a19e38b8f04e9e206d722ebdbb30aa27"} Feb 27 02:04:05 crc kubenswrapper[4771]: I0227 02:04:05.086931 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="163d73278c26f2665a4a38614d0e14f3a19e38b8f04e9e206d722ebdbb30aa27" Feb 27 02:04:05 crc kubenswrapper[4771]: I0227 02:04:05.086973 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535964-8mtr2" Feb 27 02:04:05 crc kubenswrapper[4771]: I0227 02:04:05.479604 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535958-7982l"] Feb 27 02:04:05 crc kubenswrapper[4771]: I0227 02:04:05.491594 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535958-7982l"] Feb 27 02:04:05 crc kubenswrapper[4771]: I0227 02:04:05.609022 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_60504948-6e27-4eb7-b057-4634a1951a8c/memcached/0.log" Feb 27 02:04:05 crc kubenswrapper[4771]: I0227 02:04:05.783708 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="193afeb9-addb-4815-bf2f-4eebd0e2dfac" path="/var/lib/kubelet/pods/193afeb9-addb-4815-bf2f-4eebd0e2dfac/volumes" Feb 27 02:04:21 crc kubenswrapper[4771]: I0227 02:04:21.469591 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/util/0.log" Feb 27 02:04:21 crc kubenswrapper[4771]: I0227 02:04:21.574483 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/util/0.log" Feb 27 02:04:21 crc kubenswrapper[4771]: I0227 02:04:21.658378 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/pull/0.log" Feb 27 02:04:21 crc kubenswrapper[4771]: I0227 02:04:21.839689 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/pull/0.log" Feb 27 02:04:21 crc kubenswrapper[4771]: I0227 02:04:21.967723 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/util/0.log" Feb 27 02:04:22 crc kubenswrapper[4771]: I0227 02:04:22.036669 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/pull/0.log" Feb 27 02:04:22 crc kubenswrapper[4771]: I0227 02:04:22.166101 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/extract/0.log" Feb 27 02:04:22 crc kubenswrapper[4771]: I0227 02:04:22.376068 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-snqrx_9f4615e8-ebc8-43ff-bdec-481f86af58bf/manager/0.log" Feb 27 02:04:22 crc kubenswrapper[4771]: I0227 02:04:22.569466 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-9jpm2_f882b343-7b46-4516-9a17-833858bbfda7/manager/0.log" Feb 27 02:04:22 crc kubenswrapper[4771]: I0227 02:04:22.666759 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-zlggr_f77508f2-411f-4644-9b48-7edbefaf3bb4/manager/0.log" Feb 27 02:04:22 crc kubenswrapper[4771]: I0227 02:04:22.792300 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-x969l_a40b776f-5677-4909-8b04-a5b2318737bc/manager/0.log" Feb 27 02:04:23 crc kubenswrapper[4771]: I0227 02:04:23.042343 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-p8rvj_646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7/manager/0.log" Feb 27 02:04:23 crc kubenswrapper[4771]: I0227 02:04:23.234223 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-w5hxv_b563eec9-7160-44db-a640-4cf7e25bc893/manager/0.log" Feb 27 02:04:23 crc kubenswrapper[4771]: I0227 02:04:23.560879 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-t65sw_eb603c5e-cb7c-41e4-ac8a-f9a960141d16/manager/0.log" Feb 27 02:04:23 crc kubenswrapper[4771]: I0227 02:04:23.568201 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-4p5fg_5ea9fc68-1ea7-48fe-b692-f99747dbd694/manager/0.log" Feb 27 02:04:23 crc kubenswrapper[4771]: I0227 02:04:23.857070 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-llvjw_0c8b88b1-8f42-458c-933e-0bcd17da38cb/manager/0.log" Feb 27 02:04:23 crc kubenswrapper[4771]: I0227 02:04:23.915253 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-df8gr_17dfc012-107f-437d-bbfd-13a1250857ed/manager/0.log" Feb 27 02:04:24 crc kubenswrapper[4771]: I0227 02:04:24.145880 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-6j9rs_20a5fef1-ac14-40c6-bb97-6e6f39be1645/manager/0.log" Feb 27 02:04:24 crc kubenswrapper[4771]: I0227 02:04:24.348921 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-4fsjk_61b58ad1-8db7-4a41-9774-38781245baff/manager/0.log" Feb 27 02:04:24 crc kubenswrapper[4771]: I0227 02:04:24.416916 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-65x55_7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205/manager/0.log" Feb 27 02:04:24 crc kubenswrapper[4771]: I0227 02:04:24.648483 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq_e01a3024-1558-41e4-bbb4-06451d536782/manager/0.log" Feb 27 02:04:24 crc kubenswrapper[4771]: I0227 02:04:24.983866 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-b5b8f6cf4-m2tbq_79f9396a-5f0c-4909-b710-4914faa9e011/operator/0.log" Feb 27 02:04:25 crc kubenswrapper[4771]: I0227 02:04:25.227785 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-d8jl8_6846ec0e-56f5-4bad-9539-0f6578027f45/registry-server/0.log" Feb 27 02:04:25 crc kubenswrapper[4771]: I0227 02:04:25.464249 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-2qcds_7cf10a28-d86e-4299-8b06-84888ca3dcb9/manager/0.log" Feb 27 02:04:25 crc kubenswrapper[4771]: I0227 02:04:25.468468 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-vbhct_e5ed9ba2-1499-42b0-9a16-213f7bd6336f/manager/0.log" Feb 27 02:04:25 crc kubenswrapper[4771]: I0227 02:04:25.713632 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2rpsh_bef6603d-191e-4d4b-b824-4a8d4f81c991/operator/0.log" Feb 27 02:04:25 crc kubenswrapper[4771]: I0227 02:04:25.883739 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-d8xdb_aece7f0f-11e5-4934-b818-f8c92e54439b/manager/0.log" Feb 27 02:04:25 crc kubenswrapper[4771]: I0227 02:04:25.981259 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-wb7w9_a7c97c14-2dc7-409a-bb85-7e10031e839b/manager/0.log" Feb 27 02:04:26 crc kubenswrapper[4771]: I0227 02:04:26.117808 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-cp5l8_987278ec-2526-4db5-a442-58b38687805c/manager/0.log" Feb 27 02:04:26 crc kubenswrapper[4771]: I0227 02:04:26.340308 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-lg7vn_981a63b0-1a15-42f0-8d4a-0dc24dbd87b1/manager/0.log" Feb 27 02:04:26 crc kubenswrapper[4771]: I0227 02:04:26.572473 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dc6fb848b-7nk64_b4a70780-ab41-4199-b1b8-09b01cd6a4ac/manager/0.log" Feb 27 02:04:27 crc kubenswrapper[4771]: I0227 02:04:27.664323 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-mrvth_8bd8d6ef-0025-4148-a530-1964ae763645/manager/0.log" Feb 27 02:04:30 crc kubenswrapper[4771]: I0227 02:04:30.894173 4771 scope.go:117] "RemoveContainer" containerID="0179f80dea866d464d740c87d427df2b96226568e75808bf880a34c5e2effebb" Feb 27 02:04:44 crc kubenswrapper[4771]: I0227 02:04:44.885875 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2qwgc_62c59a17-8b65-4876-a007-1cb1f45a7c2b/control-plane-machine-set-operator/0.log" Feb 27 02:04:45 crc kubenswrapper[4771]: I0227 02:04:45.081511 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4vrtf_7bd5b18f-fa8c-46d4-a571-630a67b14023/machine-api-operator/0.log" Feb 27 02:04:45 crc kubenswrapper[4771]: I0227 02:04:45.103985 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4vrtf_7bd5b18f-fa8c-46d4-a571-630a67b14023/kube-rbac-proxy/0.log" Feb 27 02:04:58 crc kubenswrapper[4771]: I0227 02:04:58.202676 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-6n589_6a0dd098-846f-4aab-b87f-4d06728195c5/cert-manager-controller/0.log" Feb 27 02:04:58 crc kubenswrapper[4771]: I0227 02:04:58.377317 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8pmgm_dca42308-0eb3-4c5b-a620-cbbb29c3c88f/cert-manager-cainjector/0.log" Feb 27 02:04:58 crc kubenswrapper[4771]: I0227 02:04:58.432844 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-gd2tq_d066e334-9b58-464d-80d8-899a6390d5c5/cert-manager-webhook/0.log" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.192098 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sw56g"] Feb 27 02:04:59 crc kubenswrapper[4771]: E0227 02:04:59.192919 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11c843e-c871-4ff4-96d0-6b35e86d9453" containerName="oc" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.192946 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11c843e-c871-4ff4-96d0-6b35e86d9453" containerName="oc" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.193195 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11c843e-c871-4ff4-96d0-6b35e86d9453" containerName="oc" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.198151 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.202668 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw56g"] Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.295357 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-utilities\") pod \"redhat-marketplace-sw56g\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.295431 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-catalog-content\") pod \"redhat-marketplace-sw56g\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.295630 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmmsp\" (UniqueName: \"kubernetes.io/projected/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-kube-api-access-fmmsp\") pod \"redhat-marketplace-sw56g\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.397683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-utilities\") pod \"redhat-marketplace-sw56g\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.397771 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-catalog-content\") pod \"redhat-marketplace-sw56g\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.397841 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmmsp\" (UniqueName: \"kubernetes.io/projected/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-kube-api-access-fmmsp\") pod \"redhat-marketplace-sw56g\" 
(UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.398335 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-utilities\") pod \"redhat-marketplace-sw56g\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.398357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-catalog-content\") pod \"redhat-marketplace-sw56g\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.433397 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmmsp\" (UniqueName: \"kubernetes.io/projected/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-kube-api-access-fmmsp\") pod \"redhat-marketplace-sw56g\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:04:59 crc kubenswrapper[4771]: I0227 02:04:59.533013 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:05:00 crc kubenswrapper[4771]: I0227 02:05:00.080635 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw56g"] Feb 27 02:05:00 crc kubenswrapper[4771]: I0227 02:05:00.573335 4771 generic.go:334] "Generic (PLEG): container finished" podID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerID="819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a" exitCode=0 Feb 27 02:05:00 crc kubenswrapper[4771]: I0227 02:05:00.573374 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw56g" event={"ID":"67cec81f-f572-4b2b-a6a5-dd35a9d916b3","Type":"ContainerDied","Data":"819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a"} Feb 27 02:05:00 crc kubenswrapper[4771]: I0227 02:05:00.573400 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw56g" event={"ID":"67cec81f-f572-4b2b-a6a5-dd35a9d916b3","Type":"ContainerStarted","Data":"a24912082a371e829dbe8fe0e5339ade9cfda731aaef839ae55beaef1bfd1f27"} Feb 27 02:05:01 crc kubenswrapper[4771]: I0227 02:05:01.583859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw56g" event={"ID":"67cec81f-f572-4b2b-a6a5-dd35a9d916b3","Type":"ContainerStarted","Data":"9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4"} Feb 27 02:05:02 crc kubenswrapper[4771]: I0227 02:05:02.602066 4771 generic.go:334] "Generic (PLEG): container finished" podID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerID="9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4" exitCode=0 Feb 27 02:05:02 crc kubenswrapper[4771]: I0227 02:05:02.602201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw56g" event={"ID":"67cec81f-f572-4b2b-a6a5-dd35a9d916b3","Type":"ContainerDied","Data":"9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4"} Feb 27 02:05:03 crc kubenswrapper[4771]: I0227 02:05:03.612741 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-sw56g" event={"ID":"67cec81f-f572-4b2b-a6a5-dd35a9d916b3","Type":"ContainerStarted","Data":"f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87"} Feb 27 02:05:03 crc kubenswrapper[4771]: I0227 02:05:03.638184 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sw56g" podStartSLOduration=2.18252411 podStartE2EDuration="4.638161301s" podCreationTimestamp="2026-02-27 02:04:59 +0000 UTC" firstStartedPulling="2026-02-27 02:05:00.575825225 +0000 UTC m=+3613.513386513" lastFinishedPulling="2026-02-27 02:05:03.031462406 +0000 UTC m=+3615.969023704" observedRunningTime="2026-02-27 02:05:03.630432873 +0000 UTC m=+3616.567994181" watchObservedRunningTime="2026-02-27 02:05:03.638161301 +0000 UTC m=+3616.575722599" Feb 27 02:05:09 crc kubenswrapper[4771]: I0227 02:05:09.533411 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:05:09 crc kubenswrapper[4771]: I0227 02:05:09.535074 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:05:09 crc kubenswrapper[4771]: I0227 02:05:09.617367 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:05:09 crc kubenswrapper[4771]: I0227 02:05:09.714559 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:05:09 crc kubenswrapper[4771]: I0227 02:05:09.859726 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw56g"] Feb 27 02:05:11 crc kubenswrapper[4771]: I0227 02:05:11.680636 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sw56g" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerName="registry-server" containerID="cri-o://f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87" gracePeriod=2 Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.188483 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.279075 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmmsp\" (UniqueName: \"kubernetes.io/projected/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-kube-api-access-fmmsp\") pod \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.279707 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-utilities\") pod \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.279924 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-catalog-content\") pod \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\" (UID: \"67cec81f-f572-4b2b-a6a5-dd35a9d916b3\") " Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.297187 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-kube-api-access-fmmsp" (OuterVolumeSpecName: "kube-api-access-fmmsp") pod "67cec81f-f572-4b2b-a6a5-dd35a9d916b3" (UID: "67cec81f-f572-4b2b-a6a5-dd35a9d916b3"). InnerVolumeSpecName "kube-api-access-fmmsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.299041 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-utilities" (OuterVolumeSpecName: "utilities") pod "67cec81f-f572-4b2b-a6a5-dd35a9d916b3" (UID: "67cec81f-f572-4b2b-a6a5-dd35a9d916b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.314312 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67cec81f-f572-4b2b-a6a5-dd35a9d916b3" (UID: "67cec81f-f572-4b2b-a6a5-dd35a9d916b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.381877 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.381908 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmmsp\" (UniqueName: \"kubernetes.io/projected/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-kube-api-access-fmmsp\") on node \"crc\" DevicePath \"\"" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.381922 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67cec81f-f572-4b2b-a6a5-dd35a9d916b3-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.527535 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-thhng_7c2f136b-c273-45f2-bbd2-923046cf0861/nmstate-console-plugin/0.log" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.658420 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ln8xd_7b2bdabc-b325-4bc2-91f8-39e9f12ec946/nmstate-handler/0.log" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.688483 4771 generic.go:334] "Generic (PLEG): container finished" podID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerID="f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87" exitCode=0 Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.688520 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw56g" event={"ID":"67cec81f-f572-4b2b-a6a5-dd35a9d916b3","Type":"ContainerDied","Data":"f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87"} Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.688590 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw56g" event={"ID":"67cec81f-f572-4b2b-a6a5-dd35a9d916b3","Type":"ContainerDied","Data":"a24912082a371e829dbe8fe0e5339ade9cfda731aaef839ae55beaef1bfd1f27"} Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.688609 4771 scope.go:117] "RemoveContainer" containerID="f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.688693 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw56g" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.694725 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-bkc9p_6049b388-cb33-408a-848e-90a3e9767488/kube-rbac-proxy/0.log" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.720281 4771 scope.go:117] "RemoveContainer" containerID="9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.724919 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw56g"] Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.734513 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw56g"] Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.744824 4771 scope.go:117] "RemoveContainer" containerID="819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.761732 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-bkc9p_6049b388-cb33-408a-848e-90a3e9767488/nmstate-metrics/0.log" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.781289 4771 scope.go:117] "RemoveContainer" containerID="f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87" Feb 27 02:05:12 crc kubenswrapper[4771]: E0227 02:05:12.781728 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87\": container with ID starting with f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87 not found: ID does not exist" containerID="f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.781833 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87"} err="failed to get container status \"f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87\": rpc error: code = NotFound desc = could not find container \"f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87\": container with ID starting with f134bd3c0899136d4b3978424641c5a53d71db5d3c1441a30e99051184a36a87 not found: ID does not exist" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.781911 4771 scope.go:117] "RemoveContainer" containerID="9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4" Feb 27 02:05:12 crc kubenswrapper[4771]: E0227 02:05:12.782832 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4\": container with ID starting with 9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4 not found: ID does not exist" containerID="9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.782862 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4"} err="failed to get container status \"9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4\": rpc error: code = NotFound desc = could not find container 
\"9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4\": container with ID starting with 9a1caf305446a6e94f3c4838d20643e5eee2a763d0a00eefb60b4fb29a13dfa4 not found: ID does not exist" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.782879 4771 scope.go:117] "RemoveContainer" containerID="819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a" Feb 27 02:05:12 crc kubenswrapper[4771]: E0227 02:05:12.783140 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a\": container with ID starting with 819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a not found: ID does not exist" containerID="819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.783235 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a"} err="failed to get container status \"819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a\": rpc error: code = NotFound desc = could not find container \"819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a\": container with ID starting with 819788eec0d88349601136e7b775c570542dfb4b7a16be73e7a87eb783b33c2a not found: ID does not exist" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.893835 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-tmk5t_b7560148-b519-4709-a6a8-184258052e14/nmstate-operator/0.log" Feb 27 02:05:12 crc kubenswrapper[4771]: I0227 02:05:12.996948 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-pfb4d_397a2bf0-511c-4cc9-964c-e1d2efc662ea/nmstate-webhook/0.log" Feb 27 02:05:13 crc kubenswrapper[4771]: I0227 02:05:13.784002 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" path="/var/lib/kubelet/pods/67cec81f-f572-4b2b-a6a5-dd35a9d916b3/volumes" Feb 27 02:05:41 crc kubenswrapper[4771]: I0227 02:05:41.438083 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sgfp9_cd363b49-3f3c-46af-834d-5ab27e2ed35e/kube-rbac-proxy/0.log" Feb 27 02:05:41 crc kubenswrapper[4771]: I0227 02:05:41.571459 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sgfp9_cd363b49-3f3c-46af-834d-5ab27e2ed35e/controller/0.log" Feb 27 02:05:41 crc kubenswrapper[4771]: I0227 02:05:41.601077 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-frr-files/0.log" Feb 27 02:05:41 crc kubenswrapper[4771]: I0227 02:05:41.751342 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-frr-files/0.log" Feb 27 02:05:41 crc kubenswrapper[4771]: I0227 02:05:41.779541 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-metrics/0.log" Feb 27 02:05:41 crc kubenswrapper[4771]: I0227 02:05:41.786816 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-reloader/0.log" Feb 27 02:05:41 crc kubenswrapper[4771]: I0227 02:05:41.851198 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-reloader/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.066496 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-reloader/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.069220 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-frr-files/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.103442 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-metrics/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.107001 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-metrics/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.270140 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-frr-files/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.270901 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-reloader/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.271846 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-metrics/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.334198 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/controller/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.450858 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/kube-rbac-proxy/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.453320 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/frr-metrics/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.589842 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/kube-rbac-proxy-frr/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.718313 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/reloader/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.846174 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-8mq2q_4e3da97e-a051-4d50-b905-3ed4c804cfc6/frr-k8s-webhook-server/0.log" Feb 27 02:05:42 crc kubenswrapper[4771]: I0227 02:05:42.955129 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-d75cc4945-g8fp7_3294b45f-a2de-4a92-8466-46c17ddd0238/manager/0.log" Feb 27 02:05:43 crc kubenswrapper[4771]: I0227 02:05:43.192214 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65c77fdb5d-6ltq2_635de0be-09c0-49ad-905c-49caa1c8b50e/webhook-server/0.log" Feb 27 02:05:43 crc kubenswrapper[4771]: I0227 02:05:43.320374 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-l5k7x_61a9b00d-d330-4575-bdac-adff64f6786d/kube-rbac-proxy/0.log" Feb 27 02:05:43 crc kubenswrapper[4771]: I0227 02:05:43.767905 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l5k7x_61a9b00d-d330-4575-bdac-adff64f6786d/speaker/0.log" Feb 27 02:05:44 crc kubenswrapper[4771]: I0227 02:05:44.018910 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/frr/0.log" Feb 27 02:05:57 crc kubenswrapper[4771]: I0227 02:05:57.666635 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/util/0.log" Feb 27 02:05:57 crc kubenswrapper[4771]: I0227 02:05:57.864173 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/pull/0.log" Feb 27 02:05:57 crc kubenswrapper[4771]: I0227 02:05:57.888752 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/util/0.log" Feb 27 02:05:57 crc kubenswrapper[4771]: I0227 02:05:57.889653 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/pull/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.108786 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/util/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.110234 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/pull/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.125766 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/extract/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.250502 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-utilities/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.408934 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-utilities/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.442934 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-content/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.455545 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-content/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.609564 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-content/0.log" Feb 27 02:05:58 
crc kubenswrapper[4771]: I0227 02:05:58.613011 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-utilities/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.842435 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-utilities/0.log" Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.953042 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 02:05:58 crc kubenswrapper[4771]: I0227 02:05:58.953086 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.042436 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-content/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.043666 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-utilities/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.067610 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-content/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.159360 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/registry-server/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.231430 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-content/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.263441 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-utilities/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.526490 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/util/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.697757 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/util/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.698529 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/pull/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.761869 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/pull/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.864213 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/registry-server/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.954344 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/pull/0.log" Feb 27 02:05:59 crc kubenswrapper[4771]: I0227 02:05:59.963184 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/util/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.015158 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/extract/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.136019 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535966-7tjxj"] Feb 27 02:06:00 crc kubenswrapper[4771]: E0227 02:06:00.136404 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerName="registry-server" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.136421 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerName="registry-server" Feb 27 02:06:00 crc kubenswrapper[4771]: E0227 02:06:00.136437 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerName="extract-utilities" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.136444 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerName="extract-utilities" Feb 27 02:06:00 crc kubenswrapper[4771]: E0227 02:06:00.136466 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerName="extract-content" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.136472 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerName="extract-content" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.136672 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="67cec81f-f572-4b2b-a6a5-dd35a9d916b3" containerName="registry-server" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.137331 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535966-7tjxj" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.139084 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.139630 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.140744 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.141506 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jffnf_9cb60be5-a0ff-489e-a473-32a72359b2ce/marketplace-operator/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.153875 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535966-7tjxj"] Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.231947 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-utilities/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.235950 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf66r\" (UniqueName: \"kubernetes.io/projected/118838b1-f460-498e-8197-1894fb3b3669-kube-api-access-pf66r\") pod \"auto-csr-approver-29535966-7tjxj\" (UID: \"118838b1-f460-498e-8197-1894fb3b3669\") " pod="openshift-infra/auto-csr-approver-29535966-7tjxj" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.338348 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf66r\" (UniqueName: \"kubernetes.io/projected/118838b1-f460-498e-8197-1894fb3b3669-kube-api-access-pf66r\") pod \"auto-csr-approver-29535966-7tjxj\" (UID: \"118838b1-f460-498e-8197-1894fb3b3669\") " pod="openshift-infra/auto-csr-approver-29535966-7tjxj" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.361495 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-utilities/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.365253 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf66r\" (UniqueName: \"kubernetes.io/projected/118838b1-f460-498e-8197-1894fb3b3669-kube-api-access-pf66r\") pod \"auto-csr-approver-29535966-7tjxj\" (UID: \"118838b1-f460-498e-8197-1894fb3b3669\") " pod="openshift-infra/auto-csr-approver-29535966-7tjxj" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.385375 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-content/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.402800 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-content/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.454244 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535966-7tjxj" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.587464 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-content/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.594235 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-utilities/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.804802 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/registry-server/0.log" Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.940119 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535966-7tjxj"] Feb 27 02:06:00 crc kubenswrapper[4771]: W0227 02:06:00.945864 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod118838b1_f460_498e_8197_1894fb3b3669.slice/crio-a6bd39d6d9bd0a763fec2d7af6fad6f78cd6f6957f5b1bda443cbdc599524e5b WatchSource:0}: Error finding container a6bd39d6d9bd0a763fec2d7af6fad6f78cd6f6957f5b1bda443cbdc599524e5b: Status 404 returned error can't find the container with id a6bd39d6d9bd0a763fec2d7af6fad6f78cd6f6957f5b1bda443cbdc599524e5b Feb 27 02:06:00 crc kubenswrapper[4771]: I0227 02:06:00.949304 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-utilities/0.log" Feb 27 02:06:01 crc kubenswrapper[4771]: I0227 02:06:01.070435 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-utilities/0.log" Feb 27 02:06:01 crc kubenswrapper[4771]: I0227 02:06:01.104256 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-content/0.log" Feb 27 02:06:01 crc kubenswrapper[4771]: I0227 02:06:01.109287 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-content/0.log" Feb 27 02:06:01 crc kubenswrapper[4771]: I0227 02:06:01.193774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535966-7tjxj" event={"ID":"118838b1-f460-498e-8197-1894fb3b3669","Type":"ContainerStarted","Data":"a6bd39d6d9bd0a763fec2d7af6fad6f78cd6f6957f5b1bda443cbdc599524e5b"} Feb 27 02:06:01 crc kubenswrapper[4771]: I0227 02:06:01.267882 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-utilities/0.log" Feb 27 02:06:01 crc kubenswrapper[4771]: I0227 02:06:01.300857 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-content/0.log" Feb 27 02:06:01 crc kubenswrapper[4771]: I0227 02:06:01.755033 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/registry-server/0.log" Feb 27 02:06:03 crc kubenswrapper[4771]: I0227 02:06:03.219511 4771 generic.go:334] "Generic (PLEG): 
container finished" podID="118838b1-f460-498e-8197-1894fb3b3669" containerID="d0127d4998e0f7d5800808abfc47c48367f75d52076c0c98605c799f35d285d4" exitCode=0 Feb 27 02:06:03 crc kubenswrapper[4771]: I0227 02:06:03.219644 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535966-7tjxj" event={"ID":"118838b1-f460-498e-8197-1894fb3b3669","Type":"ContainerDied","Data":"d0127d4998e0f7d5800808abfc47c48367f75d52076c0c98605c799f35d285d4"} Feb 27 02:06:04 crc kubenswrapper[4771]: I0227 02:06:04.636213 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535966-7tjxj" Feb 27 02:06:04 crc kubenswrapper[4771]: I0227 02:06:04.821157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf66r\" (UniqueName: \"kubernetes.io/projected/118838b1-f460-498e-8197-1894fb3b3669-kube-api-access-pf66r\") pod \"118838b1-f460-498e-8197-1894fb3b3669\" (UID: \"118838b1-f460-498e-8197-1894fb3b3669\") " Feb 27 02:06:04 crc kubenswrapper[4771]: I0227 02:06:04.836833 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118838b1-f460-498e-8197-1894fb3b3669-kube-api-access-pf66r" (OuterVolumeSpecName: "kube-api-access-pf66r") pod "118838b1-f460-498e-8197-1894fb3b3669" (UID: "118838b1-f460-498e-8197-1894fb3b3669"). InnerVolumeSpecName "kube-api-access-pf66r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:06:04 crc kubenswrapper[4771]: I0227 02:06:04.923803 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf66r\" (UniqueName: \"kubernetes.io/projected/118838b1-f460-498e-8197-1894fb3b3669-kube-api-access-pf66r\") on node \"crc\" DevicePath \"\"" Feb 27 02:06:05 crc kubenswrapper[4771]: I0227 02:06:05.243488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535966-7tjxj" event={"ID":"118838b1-f460-498e-8197-1894fb3b3669","Type":"ContainerDied","Data":"a6bd39d6d9bd0a763fec2d7af6fad6f78cd6f6957f5b1bda443cbdc599524e5b"} Feb 27 02:06:05 crc kubenswrapper[4771]: I0227 02:06:05.243542 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535966-7tjxj" Feb 27 02:06:05 crc kubenswrapper[4771]: I0227 02:06:05.243545 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6bd39d6d9bd0a763fec2d7af6fad6f78cd6f6957f5b1bda443cbdc599524e5b" Feb 27 02:06:05 crc kubenswrapper[4771]: I0227 02:06:05.712767 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535960-plddw"] Feb 27 02:06:05 crc kubenswrapper[4771]: I0227 02:06:05.720424 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535960-plddw"] Feb 27 02:06:05 crc kubenswrapper[4771]: I0227 02:06:05.787434 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5" path="/var/lib/kubelet/pods/e1605ec4-f8ca-4b1f-a83e-6e2b3c47f0b5/volumes" Feb 27 02:06:25 crc kubenswrapper[4771]: E0227 02:06:25.191990 4771 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.189:48552->38.102.83.189:46269: read tcp 38.102.83.189:48552->38.102.83.189:46269: read: connection reset by peer Feb 27 02:06:28 crc kubenswrapper[4771]: I0227 02:06:28.953422 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 02:06:28 crc kubenswrapper[4771]: I0227 02:06:28.954085 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 02:06:31 crc kubenswrapper[4771]: I0227 02:06:31.023371 4771 scope.go:117] "RemoveContainer" containerID="877f8124a79b3607236d0a11442fbf3d7a87931db76d21be89efbeb0295a1431" Feb 27 02:06:58 crc kubenswrapper[4771]: I0227 02:06:58.957508 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 02:06:58 crc kubenswrapper[4771]: I0227 02:06:58.958231 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 02:06:58 crc kubenswrapper[4771]: I0227 02:06:58.958320 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 02:06:58 crc kubenswrapper[4771]: I0227 02:06:58.959747 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d25c2ee99624ab7384d501c40defa38c3465b16865a65e0252cb74db46e114bb"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 02:06:58 crc kubenswrapper[4771]: I0227 02:06:58.959865 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://d25c2ee99624ab7384d501c40defa38c3465b16865a65e0252cb74db46e114bb" gracePeriod=600 Feb 27 02:06:59 crc kubenswrapper[4771]: I0227 02:06:59.783217 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="d25c2ee99624ab7384d501c40defa38c3465b16865a65e0252cb74db46e114bb" exitCode=0 Feb 27 02:06:59 crc kubenswrapper[4771]: I0227 02:06:59.804705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"d25c2ee99624ab7384d501c40defa38c3465b16865a65e0252cb74db46e114bb"} Feb 27 02:06:59 crc kubenswrapper[4771]: I0227 02:06:59.804769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb"} Feb 27 02:06:59 crc kubenswrapper[4771]: I0227 02:06:59.804793 4771 scope.go:117] "RemoveContainer" containerID="a317be423bd46af4cd2543179730008bb4360bf295a905d70ffa836be236abcc" Feb 27 02:07:47 crc kubenswrapper[4771]: I0227 02:07:47.325508 4771 generic.go:334] "Generic (PLEG): container finished" podID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerID="1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58" exitCode=0 Feb 27 02:07:47 crc kubenswrapper[4771]: I0227 02:07:47.325610 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7zpf6/must-gather-69wsd" event={"ID":"27073f48-03fd-4fce-99f4-730ba4479ae8","Type":"ContainerDied","Data":"1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58"} Feb 27 02:07:47 crc kubenswrapper[4771]: I0227 02:07:47.326976 4771 scope.go:117] "RemoveContainer" containerID="1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58" Feb 27 02:07:47 crc kubenswrapper[4771]: I0227 02:07:47.501059 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7zpf6_must-gather-69wsd_27073f48-03fd-4fce-99f4-730ba4479ae8/gather/0.log" Feb 27 02:07:55 crc kubenswrapper[4771]: I0227 02:07:55.332786 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7zpf6/must-gather-69wsd"] Feb 27 02:07:55 crc kubenswrapper[4771]: I0227 02:07:55.333649 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7zpf6/must-gather-69wsd" podUID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerName="copy" containerID="cri-o://c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14" gracePeriod=2 Feb 27 02:07:55 crc kubenswrapper[4771]: I0227 02:07:55.344851 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7zpf6/must-gather-69wsd"] Feb 27 02:07:55 crc kubenswrapper[4771]: I0227 02:07:55.829278 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7zpf6_must-gather-69wsd_27073f48-03fd-4fce-99f4-730ba4479ae8/copy/0.log" Feb 27 02:07:55 crc kubenswrapper[4771]: I0227 02:07:55.829937 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:07:55 crc kubenswrapper[4771]: I0227 02:07:55.918202 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27073f48-03fd-4fce-99f4-730ba4479ae8-must-gather-output\") pod \"27073f48-03fd-4fce-99f4-730ba4479ae8\" (UID: \"27073f48-03fd-4fce-99f4-730ba4479ae8\") " Feb 27 02:07:55 crc kubenswrapper[4771]: I0227 02:07:55.918359 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j8qr\" (UniqueName: \"kubernetes.io/projected/27073f48-03fd-4fce-99f4-730ba4479ae8-kube-api-access-5j8qr\") pod \"27073f48-03fd-4fce-99f4-730ba4479ae8\" (UID: \"27073f48-03fd-4fce-99f4-730ba4479ae8\") " Feb 27 02:07:55 crc kubenswrapper[4771]: I0227 02:07:55.945879 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27073f48-03fd-4fce-99f4-730ba4479ae8-kube-api-access-5j8qr" (OuterVolumeSpecName: "kube-api-access-5j8qr") pod "27073f48-03fd-4fce-99f4-730ba4479ae8" (UID: "27073f48-03fd-4fce-99f4-730ba4479ae8"). InnerVolumeSpecName "kube-api-access-5j8qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.028994 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j8qr\" (UniqueName: \"kubernetes.io/projected/27073f48-03fd-4fce-99f4-730ba4479ae8-kube-api-access-5j8qr\") on node \"crc\" DevicePath \"\"" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.103587 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27073f48-03fd-4fce-99f4-730ba4479ae8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "27073f48-03fd-4fce-99f4-730ba4479ae8" (UID: "27073f48-03fd-4fce-99f4-730ba4479ae8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.130789 4771 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/27073f48-03fd-4fce-99f4-730ba4479ae8-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.432273 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7zpf6_must-gather-69wsd_27073f48-03fd-4fce-99f4-730ba4479ae8/copy/0.log" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.433947 4771 generic.go:334] "Generic (PLEG): container finished" podID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerID="c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14" exitCode=143 Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.434007 4771 scope.go:117] "RemoveContainer" containerID="c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.434093 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7zpf6/must-gather-69wsd" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.819669 4771 scope.go:117] "RemoveContainer" containerID="1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.894578 4771 scope.go:117] "RemoveContainer" containerID="c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14" Feb 27 02:07:56 crc kubenswrapper[4771]: E0227 02:07:56.895045 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14\": container with ID starting with c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14 not found: ID does not exist" containerID="c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.895090 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14"} err="failed to get container status \"c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14\": rpc error: code = NotFound desc = could not find container \"c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14\": container with ID starting with c98e5510848b2ac3a844f4adc3cef2b61206b6b0f94dac9561a3a603cf8f3d14 not found: ID does not exist" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.895115 4771 scope.go:117] "RemoveContainer" containerID="1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58" Feb 27 02:07:56 crc kubenswrapper[4771]: E0227 02:07:56.898141 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58\": container with ID starting with 1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58 not found: ID does not exist" containerID="1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58" Feb 27 02:07:56 crc kubenswrapper[4771]: I0227 02:07:56.898184 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58"} err="failed to get container status \"1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58\": rpc error: code = NotFound desc = could not find container \"1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58\": container with ID starting with 1221c8926954ba14c11d9f618491b45f2fc799e61e3cc9abb9fe6b116a465b58 not found: ID does not exist" Feb 27 02:07:57 crc kubenswrapper[4771]: I0227 02:07:57.787979 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27073f48-03fd-4fce-99f4-730ba4479ae8" path="/var/lib/kubelet/pods/27073f48-03fd-4fce-99f4-730ba4479ae8/volumes" Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.141267 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535968-qgjbh"] Feb 27 02:08:00 crc kubenswrapper[4771]: E0227 02:08:00.142126 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerName="copy" Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.142138 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerName="copy" Feb 27 02:08:00 crc 
kubenswrapper[4771]: E0227 02:08:00.142169 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerName="gather"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.142175 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerName="gather"
Feb 27 02:08:00 crc kubenswrapper[4771]: E0227 02:08:00.142189 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118838b1-f460-498e-8197-1894fb3b3669" containerName="oc"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.142195 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="118838b1-f460-498e-8197-1894fb3b3669" containerName="oc"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.142357 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerName="copy"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.142374 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="118838b1-f460-498e-8197-1894fb3b3669" containerName="oc"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.142387 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="27073f48-03fd-4fce-99f4-730ba4479ae8" containerName="gather"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.142992 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535968-qgjbh"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.144593 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.144619 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.146001 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.152605 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535968-qgjbh"]
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.222840 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xqfk\" (UniqueName: \"kubernetes.io/projected/eaff08b9-734f-4294-9467-5bd95b60d836-kube-api-access-8xqfk\") pod \"auto-csr-approver-29535968-qgjbh\" (UID: \"eaff08b9-734f-4294-9467-5bd95b60d836\") " pod="openshift-infra/auto-csr-approver-29535968-qgjbh"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.324724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xqfk\" (UniqueName: \"kubernetes.io/projected/eaff08b9-734f-4294-9467-5bd95b60d836-kube-api-access-8xqfk\") pod \"auto-csr-approver-29535968-qgjbh\" (UID: \"eaff08b9-734f-4294-9467-5bd95b60d836\") " pod="openshift-infra/auto-csr-approver-29535968-qgjbh"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.352600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xqfk\" (UniqueName: \"kubernetes.io/projected/eaff08b9-734f-4294-9467-5bd95b60d836-kube-api-access-8xqfk\") pod \"auto-csr-approver-29535968-qgjbh\" (UID: \"eaff08b9-734f-4294-9467-5bd95b60d836\") " pod="openshift-infra/auto-csr-approver-29535968-qgjbh"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.467472 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535968-qgjbh"
Feb 27 02:08:00 crc kubenswrapper[4771]: I0227 02:08:00.983394 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535968-qgjbh"]
Feb 27 02:08:01 crc kubenswrapper[4771]: I0227 02:08:01.493342 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535968-qgjbh" event={"ID":"eaff08b9-734f-4294-9467-5bd95b60d836","Type":"ContainerStarted","Data":"e40b00a25d3fa4822a3a3278a89c191ecc9dbf38b8830adf05abcac28850c628"}
Feb 27 02:08:02 crc kubenswrapper[4771]: I0227 02:08:02.502859 4771 generic.go:334] "Generic (PLEG): container finished" podID="eaff08b9-734f-4294-9467-5bd95b60d836" containerID="8011ca3f487eb8d145b3f1c27f6e89cfcad8277d511ff936c8a94a1b41fdfc8c" exitCode=0
Feb 27 02:08:02 crc kubenswrapper[4771]: I0227 02:08:02.502909 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535968-qgjbh" event={"ID":"eaff08b9-734f-4294-9467-5bd95b60d836","Type":"ContainerDied","Data":"8011ca3f487eb8d145b3f1c27f6e89cfcad8277d511ff936c8a94a1b41fdfc8c"}
Feb 27 02:08:03 crc kubenswrapper[4771]: I0227 02:08:03.845176 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535968-qgjbh"
Feb 27 02:08:03 crc kubenswrapper[4771]: I0227 02:08:03.930734 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xqfk\" (UniqueName: \"kubernetes.io/projected/eaff08b9-734f-4294-9467-5bd95b60d836-kube-api-access-8xqfk\") pod \"eaff08b9-734f-4294-9467-5bd95b60d836\" (UID: \"eaff08b9-734f-4294-9467-5bd95b60d836\") "
Feb 27 02:08:03 crc kubenswrapper[4771]: I0227 02:08:03.938662 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaff08b9-734f-4294-9467-5bd95b60d836-kube-api-access-8xqfk" (OuterVolumeSpecName: "kube-api-access-8xqfk") pod "eaff08b9-734f-4294-9467-5bd95b60d836" (UID: "eaff08b9-734f-4294-9467-5bd95b60d836"). InnerVolumeSpecName "kube-api-access-8xqfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 02:08:04 crc kubenswrapper[4771]: I0227 02:08:04.033138 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xqfk\" (UniqueName: \"kubernetes.io/projected/eaff08b9-734f-4294-9467-5bd95b60d836-kube-api-access-8xqfk\") on node \"crc\" DevicePath \"\""
Feb 27 02:08:04 crc kubenswrapper[4771]: I0227 02:08:04.529692 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535968-qgjbh" event={"ID":"eaff08b9-734f-4294-9467-5bd95b60d836","Type":"ContainerDied","Data":"e40b00a25d3fa4822a3a3278a89c191ecc9dbf38b8830adf05abcac28850c628"}
Feb 27 02:08:04 crc kubenswrapper[4771]: I0227 02:08:04.529744 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e40b00a25d3fa4822a3a3278a89c191ecc9dbf38b8830adf05abcac28850c628"
Feb 27 02:08:04 crc kubenswrapper[4771]: I0227 02:08:04.529848 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535968-qgjbh"
Feb 27 02:08:04 crc kubenswrapper[4771]: I0227 02:08:04.904354 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535962-jjvmv"]
Feb 27 02:08:04 crc kubenswrapper[4771]: I0227 02:08:04.929907 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535962-jjvmv"]
Feb 27 02:08:05 crc kubenswrapper[4771]: I0227 02:08:05.783661 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181d3584-175e-49e7-9efc-58426d4b4903" path="/var/lib/kubelet/pods/181d3584-175e-49e7-9efc-58426d4b4903/volumes"
Feb 27 02:08:31 crc kubenswrapper[4771]: I0227 02:08:31.138926 4771 scope.go:117] "RemoveContainer" containerID="7db21d7f6dd657851cbd3d0512cf3dd1d2643656cf5f8a28b2c89c5d03ae58bf"
Feb 27 02:09:28 crc kubenswrapper[4771]: I0227 02:09:28.952839 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 02:09:28 crc kubenswrapper[4771]: I0227 02:09:28.953591 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 02:09:31 crc kubenswrapper[4771]: I0227 02:09:31.221418 4771 scope.go:117] "RemoveContainer" containerID="967763f8e5adef6c55bbc7a0788966df348fbcf05d5212fa79c08317cf897070"
Feb 27 02:09:31 crc kubenswrapper[4771]: I0227 02:09:31.249969 4771 scope.go:117] "RemoveContainer" containerID="c47d4c1673062628653468d9986882cab1a0097d10df323372b5a945b973eed7"
Feb 27 02:09:58 crc kubenswrapper[4771]: I0227 02:09:58.953429 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 02:09:58 crc kubenswrapper[4771]: I0227 02:09:58.954187 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.158117 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535970-cq27p"]
Feb 27 02:10:00 crc kubenswrapper[4771]: E0227 02:10:00.158795 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaff08b9-734f-4294-9467-5bd95b60d836" containerName="oc"
Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.158813 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaff08b9-734f-4294-9467-5bd95b60d836" containerName="oc"
Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.159116 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaff08b9-734f-4294-9467-5bd95b60d836" containerName="oc"
Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.160135 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535970-cq27p"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535970-cq27p" Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.162741 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.162880 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.164013 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.168907 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535970-cq27p"] Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.252439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g8k6\" (UniqueName: \"kubernetes.io/projected/c0b2f301-2490-4c19-b932-77fface25a45-kube-api-access-5g8k6\") pod \"auto-csr-approver-29535970-cq27p\" (UID: \"c0b2f301-2490-4c19-b932-77fface25a45\") " pod="openshift-infra/auto-csr-approver-29535970-cq27p" Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.354539 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g8k6\" (UniqueName: \"kubernetes.io/projected/c0b2f301-2490-4c19-b932-77fface25a45-kube-api-access-5g8k6\") pod \"auto-csr-approver-29535970-cq27p\" (UID: \"c0b2f301-2490-4c19-b932-77fface25a45\") " pod="openshift-infra/auto-csr-approver-29535970-cq27p" Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.381706 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g8k6\" (UniqueName: \"kubernetes.io/projected/c0b2f301-2490-4c19-b932-77fface25a45-kube-api-access-5g8k6\") pod \"auto-csr-approver-29535970-cq27p\" (UID: \"c0b2f301-2490-4c19-b932-77fface25a45\") " pod="openshift-infra/auto-csr-approver-29535970-cq27p" Feb 27 02:10:00 crc kubenswrapper[4771]: I0227 02:10:00.487806 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535970-cq27p" Feb 27 02:10:01 crc kubenswrapper[4771]: W0227 02:10:01.006021 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0b2f301_2490_4c19_b932_77fface25a45.slice/crio-044f642fdbe6977aa5b309e9a94a175e3d6327f17b16eed5dbd160ccc8c12d45 WatchSource:0}: Error finding container 044f642fdbe6977aa5b309e9a94a175e3d6327f17b16eed5dbd160ccc8c12d45: Status 404 returned error can't find the container with id 044f642fdbe6977aa5b309e9a94a175e3d6327f17b16eed5dbd160ccc8c12d45 Feb 27 02:10:01 crc kubenswrapper[4771]: I0227 02:10:01.010533 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 02:10:01 crc kubenswrapper[4771]: I0227 02:10:01.017731 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535970-cq27p"] Feb 27 02:10:01 crc kubenswrapper[4771]: I0227 02:10:01.109519 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535970-cq27p" event={"ID":"c0b2f301-2490-4c19-b932-77fface25a45","Type":"ContainerStarted","Data":"044f642fdbe6977aa5b309e9a94a175e3d6327f17b16eed5dbd160ccc8c12d45"} Feb 27 02:10:03 crc kubenswrapper[4771]: I0227 02:10:03.133516 4771 generic.go:334] "Generic (PLEG): container finished" podID="c0b2f301-2490-4c19-b932-77fface25a45" containerID="bf1719b71f52a0faa6d35e36cd9db3739459261dfc7ab5492a9c2f1163665cf8" exitCode=0 Feb 27 02:10:03 crc kubenswrapper[4771]: I0227 02:10:03.133619 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535970-cq27p" event={"ID":"c0b2f301-2490-4c19-b932-77fface25a45","Type":"ContainerDied","Data":"bf1719b71f52a0faa6d35e36cd9db3739459261dfc7ab5492a9c2f1163665cf8"} Feb 27 02:10:04 crc kubenswrapper[4771]: I0227 02:10:04.577297 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535970-cq27p" Feb 27 02:10:04 crc kubenswrapper[4771]: I0227 02:10:04.648083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g8k6\" (UniqueName: \"kubernetes.io/projected/c0b2f301-2490-4c19-b932-77fface25a45-kube-api-access-5g8k6\") pod \"c0b2f301-2490-4c19-b932-77fface25a45\" (UID: \"c0b2f301-2490-4c19-b932-77fface25a45\") " Feb 27 02:10:04 crc kubenswrapper[4771]: I0227 02:10:04.653846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b2f301-2490-4c19-b932-77fface25a45-kube-api-access-5g8k6" (OuterVolumeSpecName: "kube-api-access-5g8k6") pod "c0b2f301-2490-4c19-b932-77fface25a45" (UID: "c0b2f301-2490-4c19-b932-77fface25a45"). InnerVolumeSpecName "kube-api-access-5g8k6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:10:04 crc kubenswrapper[4771]: I0227 02:10:04.752980 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g8k6\" (UniqueName: \"kubernetes.io/projected/c0b2f301-2490-4c19-b932-77fface25a45-kube-api-access-5g8k6\") on node \"crc\" DevicePath \"\"" Feb 27 02:10:05 crc kubenswrapper[4771]: I0227 02:10:05.159880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535970-cq27p" event={"ID":"c0b2f301-2490-4c19-b932-77fface25a45","Type":"ContainerDied","Data":"044f642fdbe6977aa5b309e9a94a175e3d6327f17b16eed5dbd160ccc8c12d45"} Feb 27 02:10:05 crc kubenswrapper[4771]: I0227 02:10:05.159924 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="044f642fdbe6977aa5b309e9a94a175e3d6327f17b16eed5dbd160ccc8c12d45" Feb 27 02:10:05 crc kubenswrapper[4771]: I0227 02:10:05.160001 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535970-cq27p" Feb 27 02:10:05 crc kubenswrapper[4771]: I0227 02:10:05.674601 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535964-8mtr2"] Feb 27 02:10:05 crc kubenswrapper[4771]: I0227 02:10:05.683214 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535964-8mtr2"] Feb 27 02:10:05 crc kubenswrapper[4771]: I0227 02:10:05.785020 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11c843e-c871-4ff4-96d0-6b35e86d9453" path="/var/lib/kubelet/pods/c11c843e-c871-4ff4-96d0-6b35e86d9453/volumes" Feb 27 02:10:28 crc kubenswrapper[4771]: I0227 02:10:28.953806 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 02:10:28 crc kubenswrapper[4771]: I0227 02:10:28.954635 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 02:10:28 crc kubenswrapper[4771]: I0227 02:10:28.954706 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 02:10:28 crc kubenswrapper[4771]: I0227 02:10:28.955656 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 02:10:28 crc kubenswrapper[4771]: I0227 02:10:28.955760 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" gracePeriod=600 Feb 27 02:10:29 crc kubenswrapper[4771]: E0227 02:10:29.103832 4771 pod_workers.go:1301] "Error 
Feb 27 02:10:29 crc kubenswrapper[4771]: I0227 02:10:29.400662 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" exitCode=0
Feb 27 02:10:29 crc kubenswrapper[4771]: I0227 02:10:29.400717 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb"}
Feb 27 02:10:29 crc kubenswrapper[4771]: I0227 02:10:29.400752 4771 scope.go:117] "RemoveContainer" containerID="d25c2ee99624ab7384d501c40defa38c3465b16865a65e0252cb74db46e114bb"
Feb 27 02:10:29 crc kubenswrapper[4771]: I0227 02:10:29.401837 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb"
Feb 27 02:10:29 crc kubenswrapper[4771]: E0227 02:10:29.402432 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 02:10:31 crc kubenswrapper[4771]: I0227 02:10:31.378373 4771 scope.go:117] "RemoveContainer" containerID="513bb3111da60e377e92c8c5a04c6b0cddf7af8484da324650a02bca60986f30"
Feb 27 02:10:39 crc kubenswrapper[4771]: I0227 02:10:39.774404 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb"
Feb 27 02:10:39 crc kubenswrapper[4771]: E0227 02:10:39.775487 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.580456 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n5q7k/must-gather-54md8"]
Feb 27 02:10:50 crc kubenswrapper[4771]: E0227 02:10:50.581277 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b2f301-2490-4c19-b932-77fface25a45" containerName="oc"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.581288 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b2f301-2490-4c19-b932-77fface25a45" containerName="oc"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.581486 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b2f301-2490-4c19-b932-77fface25a45" containerName="oc"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.582378 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/must-gather-54md8"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.584329 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n5q7k"/"default-dockercfg-gtm25"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.585456 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n5q7k"/"openshift-service-ca.crt"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.585564 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n5q7k"/"kube-root-ca.crt"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.595063 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n5q7k/must-gather-54md8"]
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.659767 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c82vr\" (UniqueName: \"kubernetes.io/projected/375bb02e-1244-4971-8c93-07ee9b85b707-kube-api-access-c82vr\") pod \"must-gather-54md8\" (UID: \"375bb02e-1244-4971-8c93-07ee9b85b707\") " pod="openshift-must-gather-n5q7k/must-gather-54md8"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.659957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/375bb02e-1244-4971-8c93-07ee9b85b707-must-gather-output\") pod \"must-gather-54md8\" (UID: \"375bb02e-1244-4971-8c93-07ee9b85b707\") " pod="openshift-must-gather-n5q7k/must-gather-54md8"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.761480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/375bb02e-1244-4971-8c93-07ee9b85b707-must-gather-output\") pod \"must-gather-54md8\" (UID: \"375bb02e-1244-4971-8c93-07ee9b85b707\") " pod="openshift-must-gather-n5q7k/must-gather-54md8"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.761542 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c82vr\" (UniqueName: \"kubernetes.io/projected/375bb02e-1244-4971-8c93-07ee9b85b707-kube-api-access-c82vr\") pod \"must-gather-54md8\" (UID: \"375bb02e-1244-4971-8c93-07ee9b85b707\") " pod="openshift-must-gather-n5q7k/must-gather-54md8"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.761930 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/375bb02e-1244-4971-8c93-07ee9b85b707-must-gather-output\") pod \"must-gather-54md8\" (UID: \"375bb02e-1244-4971-8c93-07ee9b85b707\") " pod="openshift-must-gather-n5q7k/must-gather-54md8"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.773399 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb"
Feb 27 02:10:50 crc kubenswrapper[4771]: E0227 02:10:50.773729 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.783615 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c82vr\" (UniqueName: \"kubernetes.io/projected/375bb02e-1244-4971-8c93-07ee9b85b707-kube-api-access-c82vr\") pod \"must-gather-54md8\" (UID: \"375bb02e-1244-4971-8c93-07ee9b85b707\") " pod="openshift-must-gather-n5q7k/must-gather-54md8"
Feb 27 02:10:50 crc kubenswrapper[4771]: I0227 02:10:50.903082 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/must-gather-54md8"
Feb 27 02:10:51 crc kubenswrapper[4771]: I0227 02:10:51.393050 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n5q7k/must-gather-54md8"]
Feb 27 02:10:51 crc kubenswrapper[4771]: I0227 02:10:51.661249 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/must-gather-54md8" event={"ID":"375bb02e-1244-4971-8c93-07ee9b85b707","Type":"ContainerStarted","Data":"a046516739d441cc3c839166309d6fa4a1541cb245e806eee211ff89b5085a2e"}
Feb 27 02:10:52 crc kubenswrapper[4771]: I0227 02:10:52.670493 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/must-gather-54md8" event={"ID":"375bb02e-1244-4971-8c93-07ee9b85b707","Type":"ContainerStarted","Data":"8811ee2926d09f09109a1ccdd110cac73fda5825aca403d8fb6f38d32be101d5"}
Feb 27 02:10:52 crc kubenswrapper[4771]: I0227 02:10:52.670865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/must-gather-54md8" event={"ID":"375bb02e-1244-4971-8c93-07ee9b85b707","Type":"ContainerStarted","Data":"303446f56183c30347d19b02835526b8788297756fdb321a9df3f3f88c5ee6be"}
Feb 27 02:10:52 crc kubenswrapper[4771]: I0227 02:10:52.694509 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n5q7k/must-gather-54md8" podStartSLOduration=2.694484446 podStartE2EDuration="2.694484446s" podCreationTimestamp="2026-02-27 02:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 02:10:52.687907507 +0000 UTC m=+3965.625468795" watchObservedRunningTime="2026-02-27 02:10:52.694484446 +0000 UTC m=+3965.632045744"
Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.288671 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-lqsb5"]
Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.290495 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5"
Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.361629 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hblsb\" (UniqueName: \"kubernetes.io/projected/41fa3980-d2b1-4131-b0ff-d4542044a27e-kube-api-access-hblsb\") pod \"crc-debug-lqsb5\" (UID: \"41fa3980-d2b1-4131-b0ff-d4542044a27e\") " pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.361689 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41fa3980-d2b1-4131-b0ff-d4542044a27e-host\") pod \"crc-debug-lqsb5\" (UID: \"41fa3980-d2b1-4131-b0ff-d4542044a27e\") " pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.463537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hblsb\" (UniqueName: \"kubernetes.io/projected/41fa3980-d2b1-4131-b0ff-d4542044a27e-kube-api-access-hblsb\") pod \"crc-debug-lqsb5\" (UID: \"41fa3980-d2b1-4131-b0ff-d4542044a27e\") " pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.463843 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41fa3980-d2b1-4131-b0ff-d4542044a27e-host\") pod \"crc-debug-lqsb5\" (UID: \"41fa3980-d2b1-4131-b0ff-d4542044a27e\") " pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.463989 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41fa3980-d2b1-4131-b0ff-d4542044a27e-host\") pod \"crc-debug-lqsb5\" (UID: \"41fa3980-d2b1-4131-b0ff-d4542044a27e\") " pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.495537 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hblsb\" (UniqueName: \"kubernetes.io/projected/41fa3980-d2b1-4131-b0ff-d4542044a27e-kube-api-access-hblsb\") pod \"crc-debug-lqsb5\" (UID: \"41fa3980-d2b1-4131-b0ff-d4542044a27e\") " pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.608633 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:10:55 crc kubenswrapper[4771]: I0227 02:10:55.706831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" event={"ID":"41fa3980-d2b1-4131-b0ff-d4542044a27e","Type":"ContainerStarted","Data":"41ff4edc957cfacdd5c89c714f9611a52b924dd4be39dd0c6d9c7a4d991d400a"} Feb 27 02:10:56 crc kubenswrapper[4771]: I0227 02:10:56.715454 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" event={"ID":"41fa3980-d2b1-4131-b0ff-d4542044a27e","Type":"ContainerStarted","Data":"10ef0aeeffab415e47440c0349df5bd70ceea595aa089132e8ef194402a60324"} Feb 27 02:10:56 crc kubenswrapper[4771]: I0227 02:10:56.728976 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" podStartSLOduration=1.728961304 podStartE2EDuration="1.728961304s" podCreationTimestamp="2026-02-27 02:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 02:10:56.727167838 +0000 UTC m=+3969.664729156" watchObservedRunningTime="2026-02-27 02:10:56.728961304 +0000 UTC m=+3969.666522592" Feb 27 02:11:01 crc kubenswrapper[4771]: I0227 02:11:01.773983 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:11:01 crc kubenswrapper[4771]: E0227 02:11:01.774822 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:11:12 crc kubenswrapper[4771]: I0227 02:11:12.774045 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:11:12 crc kubenswrapper[4771]: E0227 02:11:12.774877 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:11:26 crc kubenswrapper[4771]: I0227 02:11:26.773391 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:11:26 crc kubenswrapper[4771]: E0227 02:11:26.774343 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:11:30 crc kubenswrapper[4771]: I0227 02:11:30.043709 4771 generic.go:334] "Generic (PLEG): container finished" podID="41fa3980-d2b1-4131-b0ff-d4542044a27e" containerID="10ef0aeeffab415e47440c0349df5bd70ceea595aa089132e8ef194402a60324" 
exitCode=0 Feb 27 02:11:30 crc kubenswrapper[4771]: I0227 02:11:30.043892 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" event={"ID":"41fa3980-d2b1-4131-b0ff-d4542044a27e","Type":"ContainerDied","Data":"10ef0aeeffab415e47440c0349df5bd70ceea595aa089132e8ef194402a60324"} Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.176294 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.198332 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41fa3980-d2b1-4131-b0ff-d4542044a27e-host\") pod \"41fa3980-d2b1-4131-b0ff-d4542044a27e\" (UID: \"41fa3980-d2b1-4131-b0ff-d4542044a27e\") " Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.198496 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hblsb\" (UniqueName: \"kubernetes.io/projected/41fa3980-d2b1-4131-b0ff-d4542044a27e-kube-api-access-hblsb\") pod \"41fa3980-d2b1-4131-b0ff-d4542044a27e\" (UID: \"41fa3980-d2b1-4131-b0ff-d4542044a27e\") " Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.198497 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41fa3980-d2b1-4131-b0ff-d4542044a27e-host" (OuterVolumeSpecName: "host") pod "41fa3980-d2b1-4131-b0ff-d4542044a27e" (UID: "41fa3980-d2b1-4131-b0ff-d4542044a27e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.199102 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41fa3980-d2b1-4131-b0ff-d4542044a27e-host\") on node \"crc\" DevicePath \"\"" Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.206583 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-lqsb5"] Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.210987 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41fa3980-d2b1-4131-b0ff-d4542044a27e-kube-api-access-hblsb" (OuterVolumeSpecName: "kube-api-access-hblsb") pod "41fa3980-d2b1-4131-b0ff-d4542044a27e" (UID: "41fa3980-d2b1-4131-b0ff-d4542044a27e"). InnerVolumeSpecName "kube-api-access-hblsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.214631 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-lqsb5"] Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.301057 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hblsb\" (UniqueName: \"kubernetes.io/projected/41fa3980-d2b1-4131-b0ff-d4542044a27e-kube-api-access-hblsb\") on node \"crc\" DevicePath \"\"" Feb 27 02:11:31 crc kubenswrapper[4771]: I0227 02:11:31.787094 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41fa3980-d2b1-4131-b0ff-d4542044a27e" path="/var/lib/kubelet/pods/41fa3980-d2b1-4131-b0ff-d4542044a27e/volumes" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.062442 4771 scope.go:117] "RemoveContainer" containerID="10ef0aeeffab415e47440c0349df5bd70ceea595aa089132e8ef194402a60324" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.062512 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-lqsb5" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.502478 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-jxkzf"] Feb 27 02:11:32 crc kubenswrapper[4771]: E0227 02:11:32.502974 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41fa3980-d2b1-4131-b0ff-d4542044a27e" containerName="container-00" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.502992 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="41fa3980-d2b1-4131-b0ff-d4542044a27e" containerName="container-00" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.503251 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="41fa3980-d2b1-4131-b0ff-d4542044a27e" containerName="container-00" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.504033 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.634086 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq99p\" (UniqueName: \"kubernetes.io/projected/a9ce2045-086c-4748-8cb8-c352b6336944-kube-api-access-hq99p\") pod \"crc-debug-jxkzf\" (UID: \"a9ce2045-086c-4748-8cb8-c352b6336944\") " pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.634157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9ce2045-086c-4748-8cb8-c352b6336944-host\") pod \"crc-debug-jxkzf\" (UID: \"a9ce2045-086c-4748-8cb8-c352b6336944\") " pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.736169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq99p\" (UniqueName: \"kubernetes.io/projected/a9ce2045-086c-4748-8cb8-c352b6336944-kube-api-access-hq99p\") pod \"crc-debug-jxkzf\" (UID: \"a9ce2045-086c-4748-8cb8-c352b6336944\") " pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.736242 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9ce2045-086c-4748-8cb8-c352b6336944-host\") pod \"crc-debug-jxkzf\" (UID: \"a9ce2045-086c-4748-8cb8-c352b6336944\") " pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.736334 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9ce2045-086c-4748-8cb8-c352b6336944-host\") pod \"crc-debug-jxkzf\" (UID: \"a9ce2045-086c-4748-8cb8-c352b6336944\") " pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.761305 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq99p\" (UniqueName: \"kubernetes.io/projected/a9ce2045-086c-4748-8cb8-c352b6336944-kube-api-access-hq99p\") pod \"crc-debug-jxkzf\" (UID: \"a9ce2045-086c-4748-8cb8-c352b6336944\") " pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:32 crc kubenswrapper[4771]: I0227 02:11:32.826707 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:33 crc kubenswrapper[4771]: I0227 02:11:33.074873 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" event={"ID":"a9ce2045-086c-4748-8cb8-c352b6336944","Type":"ContainerStarted","Data":"c2e6431f333c2c8381744fc71983efa0691f8b51e895ab01c1f08d3aebfea54b"} Feb 27 02:11:34 crc kubenswrapper[4771]: I0227 02:11:34.090840 4771 generic.go:334] "Generic (PLEG): container finished" podID="a9ce2045-086c-4748-8cb8-c352b6336944" containerID="b5836347d57a6fecb6836604d8592c52f044092dac8b1f2c4ed3554be10e0a28" exitCode=0 Feb 27 02:11:34 crc kubenswrapper[4771]: I0227 02:11:34.090884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" event={"ID":"a9ce2045-086c-4748-8cb8-c352b6336944","Type":"ContainerDied","Data":"b5836347d57a6fecb6836604d8592c52f044092dac8b1f2c4ed3554be10e0a28"} Feb 27 02:11:34 crc kubenswrapper[4771]: I0227 02:11:34.504324 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-jxkzf"] Feb 27 02:11:34 crc kubenswrapper[4771]: I0227 02:11:34.513217 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-jxkzf"] Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.201795 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-jxkzf" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.281812 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9ce2045-086c-4748-8cb8-c352b6336944-host\") pod \"a9ce2045-086c-4748-8cb8-c352b6336944\" (UID: \"a9ce2045-086c-4748-8cb8-c352b6336944\") " Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.281954 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9ce2045-086c-4748-8cb8-c352b6336944-host" (OuterVolumeSpecName: "host") pod "a9ce2045-086c-4748-8cb8-c352b6336944" (UID: "a9ce2045-086c-4748-8cb8-c352b6336944"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.282315 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq99p\" (UniqueName: \"kubernetes.io/projected/a9ce2045-086c-4748-8cb8-c352b6336944-kube-api-access-hq99p\") pod \"a9ce2045-086c-4748-8cb8-c352b6336944\" (UID: \"a9ce2045-086c-4748-8cb8-c352b6336944\") " Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.282988 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9ce2045-086c-4748-8cb8-c352b6336944-host\") on node \"crc\" DevicePath \"\"" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.287421 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ce2045-086c-4748-8cb8-c352b6336944-kube-api-access-hq99p" (OuterVolumeSpecName: "kube-api-access-hq99p") pod "a9ce2045-086c-4748-8cb8-c352b6336944" (UID: "a9ce2045-086c-4748-8cb8-c352b6336944"). InnerVolumeSpecName "kube-api-access-hq99p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.385137 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq99p\" (UniqueName: \"kubernetes.io/projected/a9ce2045-086c-4748-8cb8-c352b6336944-kube-api-access-hq99p\") on node \"crc\" DevicePath \"\"" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.684328 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-8vlk9"] Feb 27 02:11:35 crc kubenswrapper[4771]: E0227 02:11:35.684719 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ce2045-086c-4748-8cb8-c352b6336944" containerName="container-00" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.684733 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ce2045-086c-4748-8cb8-c352b6336944" containerName="container-00" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.684902 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ce2045-086c-4748-8cb8-c352b6336944" containerName="container-00" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.685447 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-8vlk9" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.782795 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ce2045-086c-4748-8cb8-c352b6336944" path="/var/lib/kubelet/pods/a9ce2045-086c-4748-8cb8-c352b6336944/volumes" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.799761 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-host\") pod \"crc-debug-8vlk9\" (UID: \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\") " pod="openshift-must-gather-n5q7k/crc-debug-8vlk9" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.799844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdhf9\" (UniqueName: \"kubernetes.io/projected/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-kube-api-access-gdhf9\") pod \"crc-debug-8vlk9\" (UID: \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\") " pod="openshift-must-gather-n5q7k/crc-debug-8vlk9" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.901918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-host\") pod \"crc-debug-8vlk9\" (UID: \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\") " pod="openshift-must-gather-n5q7k/crc-debug-8vlk9" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.902087 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdhf9\" (UniqueName: \"kubernetes.io/projected/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-kube-api-access-gdhf9\") pod \"crc-debug-8vlk9\" (UID: \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\") " pod="openshift-must-gather-n5q7k/crc-debug-8vlk9" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.902675 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-host\") pod \"crc-debug-8vlk9\" (UID: \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\") " pod="openshift-must-gather-n5q7k/crc-debug-8vlk9" Feb 27 02:11:35 crc kubenswrapper[4771]: I0227 02:11:35.937496 4771 operation_generator.go:637] "MountVolume.SetUp 
Feb 27 02:11:36 crc kubenswrapper[4771]: I0227 02:11:36.002098 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-8vlk9"
Feb 27 02:11:36 crc kubenswrapper[4771]: I0227 02:11:36.133059 4771 scope.go:117] "RemoveContainer" containerID="b5836347d57a6fecb6836604d8592c52f044092dac8b1f2c4ed3554be10e0a28"
Feb 27 02:11:36 crc kubenswrapper[4771]: I0227 02:11:36.133186 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-jxkzf"
Feb 27 02:11:36 crc kubenswrapper[4771]: I0227 02:11:36.141102 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/crc-debug-8vlk9" event={"ID":"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8","Type":"ContainerStarted","Data":"d6df02daf62f18fbefc1e86b70ec701f1a128c3518a59c3e76b7eb12be3b8c75"}
Feb 27 02:11:37 crc kubenswrapper[4771]: I0227 02:11:37.153864 4771 generic.go:334] "Generic (PLEG): container finished" podID="829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8" containerID="b5d8ccc475e1bc2f6146e246dc2312aa3d9937a09642ad268f99740e2715d866" exitCode=0
Feb 27 02:11:37 crc kubenswrapper[4771]: I0227 02:11:37.154112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/crc-debug-8vlk9" event={"ID":"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8","Type":"ContainerDied","Data":"b5d8ccc475e1bc2f6146e246dc2312aa3d9937a09642ad268f99740e2715d866"}
Feb 27 02:11:37 crc kubenswrapper[4771]: I0227 02:11:37.209539 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-8vlk9"]
Feb 27 02:11:37 crc kubenswrapper[4771]: I0227 02:11:37.225499 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5q7k/crc-debug-8vlk9"]
Feb 27 02:11:38 crc kubenswrapper[4771]: I0227 02:11:38.259949 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-8vlk9"
Feb 27 02:11:38 crc kubenswrapper[4771]: I0227 02:11:38.347094 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-host\") pod \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\" (UID: \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\") "
Feb 27 02:11:38 crc kubenswrapper[4771]: I0227 02:11:38.347335 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdhf9\" (UniqueName: \"kubernetes.io/projected/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-kube-api-access-gdhf9\") pod \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\" (UID: \"829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8\") "
Feb 27 02:11:38 crc kubenswrapper[4771]: I0227 02:11:38.347499 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-host" (OuterVolumeSpecName: "host") pod "829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8" (UID: "829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 27 02:11:38 crc kubenswrapper[4771]: I0227 02:11:38.348108 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-host\") on node \"crc\" DevicePath \"\""
Feb 27 02:11:38 crc kubenswrapper[4771]: I0227 02:11:38.356848 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-kube-api-access-gdhf9" (OuterVolumeSpecName: "kube-api-access-gdhf9") pod "829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8" (UID: "829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8"). InnerVolumeSpecName "kube-api-access-gdhf9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 02:11:38 crc kubenswrapper[4771]: I0227 02:11:38.450254 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdhf9\" (UniqueName: \"kubernetes.io/projected/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8-kube-api-access-gdhf9\") on node \"crc\" DevicePath \"\""
Feb 27 02:11:39 crc kubenswrapper[4771]: I0227 02:11:39.173602 4771 scope.go:117] "RemoveContainer" containerID="b5d8ccc475e1bc2f6146e246dc2312aa3d9937a09642ad268f99740e2715d866"
Feb 27 02:11:39 crc kubenswrapper[4771]: I0227 02:11:39.173677 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/crc-debug-8vlk9"
Feb 27 02:11:39 crc kubenswrapper[4771]: I0227 02:11:39.773116 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb"
Feb 27 02:11:39 crc kubenswrapper[4771]: E0227 02:11:39.773381 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 02:11:39 crc kubenswrapper[4771]: I0227 02:11:39.782666 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8" path="/var/lib/kubelet/pods/829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8/volumes"
Feb 27 02:11:51 crc kubenswrapper[4771]: I0227 02:11:51.774234 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb"
Feb 27 02:11:51 crc kubenswrapper[4771]: E0227 02:11:51.775189 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.146117 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535972-5rfsz"]
Feb 27 02:12:00 crc kubenswrapper[4771]: E0227 02:12:00.147044 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8" containerName="container-00"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.147058 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8" containerName="container-00"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.147247 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="829de0bc-71e7-4e1c-83fa-5a8c7ca18aa8" containerName="container-00"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.147866 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535972-5rfsz"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.149650 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.149951 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.150076 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.154608 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535972-5rfsz"]
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.164839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxz2\" (UniqueName: \"kubernetes.io/projected/d2947f34-0e2b-4968-9c29-ef67acacebb0-kube-api-access-4cxz2\") pod \"auto-csr-approver-29535972-5rfsz\" (UID: \"d2947f34-0e2b-4968-9c29-ef67acacebb0\") " pod="openshift-infra/auto-csr-approver-29535972-5rfsz"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.266342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxz2\" (UniqueName: \"kubernetes.io/projected/d2947f34-0e2b-4968-9c29-ef67acacebb0-kube-api-access-4cxz2\") pod \"auto-csr-approver-29535972-5rfsz\" (UID: \"d2947f34-0e2b-4968-9c29-ef67acacebb0\") " pod="openshift-infra/auto-csr-approver-29535972-5rfsz"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.288362 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxz2\" (UniqueName: \"kubernetes.io/projected/d2947f34-0e2b-4968-9c29-ef67acacebb0-kube-api-access-4cxz2\") pod \"auto-csr-approver-29535972-5rfsz\" (UID: \"d2947f34-0e2b-4968-9c29-ef67acacebb0\") " pod="openshift-infra/auto-csr-approver-29535972-5rfsz"
Feb 27 02:12:00 crc kubenswrapper[4771]: I0227 02:12:00.469371 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535972-5rfsz"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535972-5rfsz" Feb 27 02:12:01 crc kubenswrapper[4771]: I0227 02:12:01.017705 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535972-5rfsz"] Feb 27 02:12:01 crc kubenswrapper[4771]: I0227 02:12:01.365375 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535972-5rfsz" event={"ID":"d2947f34-0e2b-4968-9c29-ef67acacebb0","Type":"ContainerStarted","Data":"225292533d13827de004eebf7b77f6e51edc1fe0aef1773ad84e3c87c77ec368"} Feb 27 02:12:02 crc kubenswrapper[4771]: I0227 02:12:02.774969 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:12:02 crc kubenswrapper[4771]: E0227 02:12:02.775802 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:12:03 crc kubenswrapper[4771]: I0227 02:12:03.388785 4771 generic.go:334] "Generic (PLEG): container finished" podID="d2947f34-0e2b-4968-9c29-ef67acacebb0" containerID="17eed80d8b151144d9ba77d19620947864ff00c767218f3ffaefa7ec5b42e1d1" exitCode=0 Feb 27 02:12:03 crc kubenswrapper[4771]: I0227 02:12:03.388929 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535972-5rfsz" event={"ID":"d2947f34-0e2b-4968-9c29-ef67acacebb0","Type":"ContainerDied","Data":"17eed80d8b151144d9ba77d19620947864ff00c767218f3ffaefa7ec5b42e1d1"} Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.304135 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tddzt"] Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.308581 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.354719 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tddzt"] Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.358693 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-utilities\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.358771 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrcrq\" (UniqueName: \"kubernetes.io/projected/94094ea8-6702-4670-bd98-190530cccf8b-kube-api-access-qrcrq\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.359046 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-catalog-content\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.460976 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-utilities\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.461041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrcrq\" (UniqueName: \"kubernetes.io/projected/94094ea8-6702-4670-bd98-190530cccf8b-kube-api-access-qrcrq\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.461270 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-catalog-content\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.461457 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-utilities\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.461813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-catalog-content\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.481450 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qrcrq\" (UniqueName: \"kubernetes.io/projected/94094ea8-6702-4670-bd98-190530cccf8b-kube-api-access-qrcrq\") pod \"certified-operators-tddzt\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.640114 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.822042 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535972-5rfsz" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.871359 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cxz2\" (UniqueName: \"kubernetes.io/projected/d2947f34-0e2b-4968-9c29-ef67acacebb0-kube-api-access-4cxz2\") pod \"d2947f34-0e2b-4968-9c29-ef67acacebb0\" (UID: \"d2947f34-0e2b-4968-9c29-ef67acacebb0\") " Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.896871 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2947f34-0e2b-4968-9c29-ef67acacebb0-kube-api-access-4cxz2" (OuterVolumeSpecName: "kube-api-access-4cxz2") pod "d2947f34-0e2b-4968-9c29-ef67acacebb0" (UID: "d2947f34-0e2b-4968-9c29-ef67acacebb0"). InnerVolumeSpecName "kube-api-access-4cxz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:12:04 crc kubenswrapper[4771]: I0227 02:12:04.974735 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cxz2\" (UniqueName: \"kubernetes.io/projected/d2947f34-0e2b-4968-9c29-ef67acacebb0-kube-api-access-4cxz2\") on node \"crc\" DevicePath \"\"" Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.151154 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tddzt"] Feb 27 02:12:05 crc kubenswrapper[4771]: W0227 02:12:05.151211 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94094ea8_6702_4670_bd98_190530cccf8b.slice/crio-e8ae0b769f640f70d5468e95ea4e12ea118fe153ca601e4fa6acbd118624950f WatchSource:0}: Error finding container e8ae0b769f640f70d5468e95ea4e12ea118fe153ca601e4fa6acbd118624950f: Status 404 returned error can't find the container with id e8ae0b769f640f70d5468e95ea4e12ea118fe153ca601e4fa6acbd118624950f Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.410824 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535972-5rfsz" event={"ID":"d2947f34-0e2b-4968-9c29-ef67acacebb0","Type":"ContainerDied","Data":"225292533d13827de004eebf7b77f6e51edc1fe0aef1773ad84e3c87c77ec368"} Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.412203 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="225292533d13827de004eebf7b77f6e51edc1fe0aef1773ad84e3c87c77ec368" Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.410907 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535972-5rfsz" Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.413761 4771 generic.go:334] "Generic (PLEG): container finished" podID="94094ea8-6702-4670-bd98-190530cccf8b" containerID="a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67" exitCode=0 Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.413813 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tddzt" event={"ID":"94094ea8-6702-4670-bd98-190530cccf8b","Type":"ContainerDied","Data":"a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67"} Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.413850 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tddzt" event={"ID":"94094ea8-6702-4670-bd98-190530cccf8b","Type":"ContainerStarted","Data":"e8ae0b769f640f70d5468e95ea4e12ea118fe153ca601e4fa6acbd118624950f"} Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.910186 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535966-7tjxj"] Feb 27 02:12:05 crc kubenswrapper[4771]: I0227 02:12:05.922693 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535966-7tjxj"] Feb 27 02:12:06 crc kubenswrapper[4771]: I0227 02:12:06.424496 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tddzt" event={"ID":"94094ea8-6702-4670-bd98-190530cccf8b","Type":"ContainerStarted","Data":"9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576"} Feb 27 02:12:07 crc kubenswrapper[4771]: I0227 02:12:07.436764 4771 generic.go:334] "Generic (PLEG): container finished" podID="94094ea8-6702-4670-bd98-190530cccf8b" containerID="9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576" exitCode=0 Feb 27 02:12:07 crc kubenswrapper[4771]: I0227 02:12:07.436812 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tddzt" event={"ID":"94094ea8-6702-4670-bd98-190530cccf8b","Type":"ContainerDied","Data":"9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576"} Feb 27 02:12:07 crc kubenswrapper[4771]: I0227 02:12:07.785909 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118838b1-f460-498e-8197-1894fb3b3669" path="/var/lib/kubelet/pods/118838b1-f460-498e-8197-1894fb3b3669/volumes" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.160615 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69fd595d46-6k6cs_d010a73f-6034-48ea-b18b-3bad26fe39ee/barbican-api/0.log" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.337813 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69fd595d46-6k6cs_d010a73f-6034-48ea-b18b-3bad26fe39ee/barbican-api-log/0.log" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.381184 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-66c7555cc4-mtbzr_13fb6f6e-1dda-4e09-971a-d0629bc44ff4/barbican-keystone-listener/0.log" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.445517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tddzt" event={"ID":"94094ea8-6702-4670-bd98-190530cccf8b","Type":"ContainerStarted","Data":"80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552"} Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 
02:12:08.464249 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-66c7555cc4-mtbzr_13fb6f6e-1dda-4e09-971a-d0629bc44ff4/barbican-keystone-listener-log/0.log" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.465666 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tddzt" podStartSLOduration=2.016726772 podStartE2EDuration="4.465651089s" podCreationTimestamp="2026-02-27 02:12:04 +0000 UTC" firstStartedPulling="2026-02-27 02:12:05.415798991 +0000 UTC m=+4038.353360279" lastFinishedPulling="2026-02-27 02:12:07.864723308 +0000 UTC m=+4040.802284596" observedRunningTime="2026-02-27 02:12:08.463087124 +0000 UTC m=+4041.400648412" watchObservedRunningTime="2026-02-27 02:12:08.465651089 +0000 UTC m=+4041.403212377" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.644879 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5894b4657f-lj4ff_24cb181d-8c43-4ae8-9af0-b28f570f7f22/barbican-worker/0.log" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.656813 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5894b4657f-lj4ff_24cb181d-8c43-4ae8-9af0-b28f570f7f22/barbican-worker-log/0.log" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.872358 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-hmqg5_34ae2923-be95-45e5-a840-dfea9b17f9c4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:08 crc kubenswrapper[4771]: I0227 02:12:08.907004 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_26685008-55b9-4176-98b8-f915a6004b36/ceilometer-central-agent/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.064582 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_26685008-55b9-4176-98b8-f915a6004b36/ceilometer-notification-agent/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.106759 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_26685008-55b9-4176-98b8-f915a6004b36/proxy-httpd/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.176929 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_26685008-55b9-4176-98b8-f915a6004b36/sg-core/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.342587 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b708a5c-dd83-482a-bf4a-988909a38d76/cinder-api/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.386518 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b708a5c-dd83-482a-bf4a-988909a38d76/cinder-api-log/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.509447 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e15c68f3-a904-4d91-a778-4e5b5a728c9f/cinder-scheduler/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.629204 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e15c68f3-a904-4d91-a778-4e5b5a728c9f/probe/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.718344 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cvrnf_acd636bf-528e-4bbe-8220-e4a9b755b025/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.933308 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jt6xl_67424e5d-eec0-4d0c-ba08-eebe40f4ac6e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:09 crc kubenswrapper[4771]: I0227 02:12:09.961789 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b7nss_13aff92d-bbb5-4229-8296-90dea52e389a/init/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.146598 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b7nss_13aff92d-bbb5-4229-8296-90dea52e389a/init/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.190937 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-b7nss_13aff92d-bbb5-4229-8296-90dea52e389a/dnsmasq-dns/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.236072 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-47l7f_761add5e-bade-44af-be1b-3cbcaa54f19a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.380460 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_03b15be0-3bda-4754-b43a-35e34cb84fcb/glance-httpd/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.394197 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_03b15be0-3bda-4754-b43a-35e34cb84fcb/glance-log/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.550750 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8c197b80-0aa2-49fd-b9d6-19cbb40e59e3/glance-log/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.630566 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8c197b80-0aa2-49fd-b9d6-19cbb40e59e3/glance-httpd/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.744410 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c84df64-lmgxw_9db15a3b-2c83-4d54-b5ea-697e6362b4e9/horizon/0.log" Feb 27 02:12:10 crc kubenswrapper[4771]: I0227 02:12:10.873276 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tpsc5_d903dbaa-f429-4c92-8c5a-17c1622bf8bd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:11 crc kubenswrapper[4771]: I0227 02:12:11.122879 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c84df64-lmgxw_9db15a3b-2c83-4d54-b5ea-697e6362b4e9/horizon-log/0.log" Feb 27 02:12:11 crc kubenswrapper[4771]: I0227 02:12:11.135358 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4kmmj_53bf3a2a-497c-4432-8b0f-e8092fcb72ff/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:11 crc kubenswrapper[4771]: I0227 02:12:11.326031 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29535961-smmft_46c1ffed-6a0c-4b69-9dfe-2474731d06b7/keystone-cron/0.log" Feb 27 02:12:11 crc 
kubenswrapper[4771]: I0227 02:12:11.403578 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-56bfd8fdf6-rxxnr_b66f9559-0d35-47b3-ab89-06425ff3afd3/keystone-api/0.log" Feb 27 02:12:11 crc kubenswrapper[4771]: I0227 02:12:11.559115 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_336e9838-30f4-4164-8664-073e172d8750/kube-state-metrics/0.log" Feb 27 02:12:11 crc kubenswrapper[4771]: I0227 02:12:11.620382 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7vzmz_40c7ae0e-123b-42cf-99cf-57309d7c22b0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:11 crc kubenswrapper[4771]: I0227 02:12:11.966628 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fd6bd959-l4htk_db54a8be-2fc6-4aee-b505-e1a526407006/neutron-httpd/0.log" Feb 27 02:12:12 crc kubenswrapper[4771]: I0227 02:12:12.056337 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fd6bd959-l4htk_db54a8be-2fc6-4aee-b505-e1a526407006/neutron-api/0.log" Feb 27 02:12:12 crc kubenswrapper[4771]: I0227 02:12:12.190733 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dnvjl_9a2ce866-27c5-4ac5-8a27-d44ba505c3d8/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:12 crc kubenswrapper[4771]: I0227 02:12:12.667497 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_01f57ee9-e99c-48b0-834b-af553e0c7e5f/nova-api-log/0.log" Feb 27 02:12:12 crc kubenswrapper[4771]: I0227 02:12:12.741596 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_03b297ed-ac7f-4416-b929-b3d463bc5d72/nova-cell0-conductor-conductor/0.log" Feb 27 02:12:13 crc kubenswrapper[4771]: I0227 02:12:13.139824 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_fdf27295-a275-4fb3-9e79-c3627df37a39/nova-cell1-conductor-conductor/0.log" Feb 27 02:12:13 crc kubenswrapper[4771]: I0227 02:12:13.227983 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f748ee94-8cc7-4616-a035-a35770442cbc/nova-cell1-novncproxy-novncproxy/0.log" Feb 27 02:12:13 crc kubenswrapper[4771]: I0227 02:12:13.260132 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_01f57ee9-e99c-48b0-834b-af553e0c7e5f/nova-api-api/0.log" Feb 27 02:12:13 crc kubenswrapper[4771]: I0227 02:12:13.389273 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-6lnt2_d2a7b19f-a0a4-4aa8-80c5-f05300c19d99/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:13 crc kubenswrapper[4771]: I0227 02:12:13.578637 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_81dfb61e-b373-4273-b55c-0d4680f89779/nova-metadata-log/0.log" Feb 27 02:12:13 crc kubenswrapper[4771]: I0227 02:12:13.801284 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ec376af9-95db-45e8-bb5b-1a4bec9e0197/nova-scheduler-scheduler/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.028092 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_39fb27d1-e9a6-44e4-9f92-d5f0242a8007/mysql-bootstrap/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.236204 4771 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_39fb27d1-e9a6-44e4-9f92-d5f0242a8007/mysql-bootstrap/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.247702 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_39fb27d1-e9a6-44e4-9f92-d5f0242a8007/galera/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.403358 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8be4acd2-0f92-4f9f-9521-5da586b712f0/mysql-bootstrap/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.639921 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.640337 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.647871 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8be4acd2-0f92-4f9f-9521-5da586b712f0/galera/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.656932 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8be4acd2-0f92-4f9f-9521-5da586b712f0/mysql-bootstrap/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.689813 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.821563 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2c2dc0ad-4c8c-42bf-a442-b0c51ed3f8de/openstackclient/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.837007 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_81dfb61e-b373-4273-b55c-0d4680f89779/nova-metadata-metadata/0.log" Feb 27 02:12:14 crc kubenswrapper[4771]: I0227 02:12:14.907976 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fcmgm_3ef0bfcb-87a8-4b1d-9084-3486da00981a/openstack-network-exporter/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.105652 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tjchc_000564b2-d16b-45fb-ba91-e65b85bd7fb5/ovsdb-server-init/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.246692 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tjchc_000564b2-d16b-45fb-ba91-e65b85bd7fb5/ovsdb-server-init/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.259513 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tjchc_000564b2-d16b-45fb-ba91-e65b85bd7fb5/ovs-vswitchd/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.295107 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tjchc_000564b2-d16b-45fb-ba91-e65b85bd7fb5/ovsdb-server/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.466315 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s5lkp_8c578c69-744e-425b-8bb1-76eec4b332ec/ovn-controller/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.518820 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-c9bfk_64b58c2b-7189-40c8-94b0-c31f167845d1/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.558318 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.612107 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tddzt"] Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.677188 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_65f02053-1ff7-4e60-ae6e-e25c36df39da/openstack-network-exporter/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.693175 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_65f02053-1ff7-4e60-ae6e-e25c36df39da/ovn-northd/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.776145 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:12:15 crc kubenswrapper[4771]: E0227 02:12:15.776398 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.785131 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13396b98-6f5b-4800-854f-7b7d6af4cda4/openstack-network-exporter/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.881868 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13396b98-6f5b-4800-854f-7b7d6af4cda4/ovsdbserver-nb/0.log" Feb 27 02:12:15 crc kubenswrapper[4771]: I0227 02:12:15.949228 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ed3a808f-7dba-4f32-a081-29eab07e84c0/openstack-network-exporter/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.009502 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ed3a808f-7dba-4f32-a081-29eab07e84c0/ovsdbserver-sb/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.209365 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5cddbc5576-b9kzz_577d7298-4011-4f66-a59c-36b823400652/placement-api/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.257886 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5cddbc5576-b9kzz_577d7298-4011-4f66-a59c-36b823400652/placement-log/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.384629 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_370e8739-d955-433e-8f61-b8e3bc1d8dc7/setup-container/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.585164 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_370e8739-d955-433e-8f61-b8e3bc1d8dc7/setup-container/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.602563 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_7813115d-b642-406c-892d-61b10c9777d2/setup-container/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.660120 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_370e8739-d955-433e-8f61-b8e3bc1d8dc7/rabbitmq/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.858670 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7813115d-b642-406c-892d-61b10c9777d2/setup-container/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.881915 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7813115d-b642-406c-892d-61b10c9777d2/rabbitmq/0.log" Feb 27 02:12:16 crc kubenswrapper[4771]: I0227 02:12:16.905687 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-hxm4k_df815e54-72eb-44e8-b6dd-a1758fd381e0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.062906 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-f88s5_4e2d0148-506d-458b-89c3-1faf19410b6b/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.150784 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-p5lbt_c6b0ecf8-2611-4192-94ad-c4f9974cbab9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.349349 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pr4dd_6124b0a4-176b-41d9-8ebc-db0675eeb0e4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.465069 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2v7s2_59c79bc1-5dcb-495e-8ce8-7c74517d2df6/ssh-known-hosts-edpm-deployment/0.log" Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.523527 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tddzt" podUID="94094ea8-6702-4670-bd98-190530cccf8b" containerName="registry-server" containerID="cri-o://80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552" gracePeriod=2 Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.682182 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b8d8fb79c-qxz4q_d0f1ec21-667d-46de-abbb-cb95d29e861c/proxy-server/0.log" Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.748034 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cm796_8a59a151-f189-4128-b462-29557b12a8da/swift-ring-rebalance/0.log" Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.805890 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b8d8fb79c-qxz4q_d0f1ec21-667d-46de-abbb-cb95d29e861c/proxy-httpd/0.log" Feb 27 02:12:17 crc kubenswrapper[4771]: I0227 02:12:17.931101 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/account-auditor/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.011044 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/account-reaper/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.018874 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.031592 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-catalog-content\") pod \"94094ea8-6702-4670-bd98-190530cccf8b\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.031743 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrcrq\" (UniqueName: \"kubernetes.io/projected/94094ea8-6702-4670-bd98-190530cccf8b-kube-api-access-qrcrq\") pod \"94094ea8-6702-4670-bd98-190530cccf8b\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.031796 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-utilities\") pod \"94094ea8-6702-4670-bd98-190530cccf8b\" (UID: \"94094ea8-6702-4670-bd98-190530cccf8b\") " Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.032773 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-utilities" (OuterVolumeSpecName: "utilities") pod "94094ea8-6702-4670-bd98-190530cccf8b" (UID: "94094ea8-6702-4670-bd98-190530cccf8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.073737 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94094ea8-6702-4670-bd98-190530cccf8b-kube-api-access-qrcrq" (OuterVolumeSpecName: "kube-api-access-qrcrq") pod "94094ea8-6702-4670-bd98-190530cccf8b" (UID: "94094ea8-6702-4670-bd98-190530cccf8b"). InnerVolumeSpecName "kube-api-access-qrcrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.096648 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94094ea8-6702-4670-bd98-190530cccf8b" (UID: "94094ea8-6702-4670-bd98-190530cccf8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.114117 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/account-replicator/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.133272 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrcrq\" (UniqueName: \"kubernetes.io/projected/94094ea8-6702-4670-bd98-190530cccf8b-kube-api-access-qrcrq\") on node \"crc\" DevicePath \"\"" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.133318 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.133333 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94094ea8-6702-4670-bd98-190530cccf8b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.171131 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/container-auditor/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.219168 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/account-server/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.289365 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/container-replicator/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.342850 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/container-server/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.384878 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/container-updater/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.419180 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-auditor/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.533842 4771 generic.go:334] "Generic (PLEG): container finished" podID="94094ea8-6702-4670-bd98-190530cccf8b" containerID="80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552" exitCode=0 Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.533889 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tddzt" event={"ID":"94094ea8-6702-4670-bd98-190530cccf8b","Type":"ContainerDied","Data":"80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552"} Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.533942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tddzt" event={"ID":"94094ea8-6702-4670-bd98-190530cccf8b","Type":"ContainerDied","Data":"e8ae0b769f640f70d5468e95ea4e12ea118fe153ca601e4fa6acbd118624950f"} Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.533962 4771 scope.go:117] "RemoveContainer" containerID="80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.534112 4771 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tddzt" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.540178 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-expirer/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.562574 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-replicator/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.589224 4771 scope.go:117] "RemoveContainer" containerID="9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.594571 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tddzt"] Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.597140 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-server/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.611484 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tddzt"] Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.624001 4771 scope.go:117] "RemoveContainer" containerID="a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.629305 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/object-updater/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.668101 4771 scope.go:117] "RemoveContainer" containerID="80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552" Feb 27 02:12:18 crc kubenswrapper[4771]: E0227 02:12:18.669673 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552\": container with ID starting with 80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552 not found: ID does not exist" containerID="80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.669720 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552"} err="failed to get container status \"80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552\": rpc error: code = NotFound desc = could not find container \"80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552\": container with ID starting with 80383ba112f7e3285cffdc996dd7a76a7bbb87e389ff3057f5152b7885e1d552 not found: ID does not exist" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.669746 4771 scope.go:117] "RemoveContainer" containerID="9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576" Feb 27 02:12:18 crc kubenswrapper[4771]: E0227 02:12:18.670481 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576\": container with ID starting with 9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576 not found: ID does not exist" containerID="9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576" Feb 27 02:12:18 crc 
kubenswrapper[4771]: I0227 02:12:18.670562 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576"} err="failed to get container status \"9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576\": rpc error: code = NotFound desc = could not find container \"9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576\": container with ID starting with 9a65822d01bc8b1ece0e614a2aaad9120c7a084739dbaea9ab3bd8ffd3780576 not found: ID does not exist" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.670596 4771 scope.go:117] "RemoveContainer" containerID="a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67" Feb 27 02:12:18 crc kubenswrapper[4771]: E0227 02:12:18.671292 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67\": container with ID starting with a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67 not found: ID does not exist" containerID="a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.671387 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67"} err="failed to get container status \"a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67\": rpc error: code = NotFound desc = could not find container \"a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67\": container with ID starting with a1dd8866d077c1dbbd1161e2030ae4ec5931e6400f353696f851e2d31c74eb67 not found: ID does not exist" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.761374 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/rsync/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.792957 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_251e5c6f-c762-4a6e-9253-81f94d592239/swift-recon-cron/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.911572 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-hlghk_dc880077-8590-47a1-a434-e8cebcf3fff1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:18 crc kubenswrapper[4771]: I0227 02:12:18.980411 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4b362ce5-5892-43a0-8ec9-e280131b32ee/tempest-tests-tempest-tests-runner/0.log" Feb 27 02:12:19 crc kubenswrapper[4771]: I0227 02:12:19.124506 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8967caa6-adcf-41fa-8cba-ccb9aaf3b0f4/test-operator-logs-container/0.log" Feb 27 02:12:19 crc kubenswrapper[4771]: I0227 02:12:19.261592 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-cbjb5_fb85d6f3-f2d1-40e4-8cb8-4f3f2d58142b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 27 02:12:19 crc kubenswrapper[4771]: I0227 02:12:19.782864 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94094ea8-6702-4670-bd98-190530cccf8b" path="/var/lib/kubelet/pods/94094ea8-6702-4670-bd98-190530cccf8b/volumes" Feb 
27 02:12:26 crc kubenswrapper[4771]: I0227 02:12:26.773196 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:12:26 crc kubenswrapper[4771]: E0227 02:12:26.773727 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:12:31 crc kubenswrapper[4771]: I0227 02:12:31.092087 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_60504948-6e27-4eb7-b057-4634a1951a8c/memcached/0.log" Feb 27 02:12:31 crc kubenswrapper[4771]: I0227 02:12:31.477462 4771 scope.go:117] "RemoveContainer" containerID="d0127d4998e0f7d5800808abfc47c48367f75d52076c0c98605c799f35d285d4" Feb 27 02:12:37 crc kubenswrapper[4771]: I0227 02:12:37.777828 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:12:37 crc kubenswrapper[4771]: E0227 02:12:37.778798 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.586490 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zfs8z"] Feb 27 02:12:39 crc kubenswrapper[4771]: E0227 02:12:39.588744 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94094ea8-6702-4670-bd98-190530cccf8b" containerName="extract-utilities" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.588852 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="94094ea8-6702-4670-bd98-190530cccf8b" containerName="extract-utilities" Feb 27 02:12:39 crc kubenswrapper[4771]: E0227 02:12:39.588933 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94094ea8-6702-4670-bd98-190530cccf8b" containerName="extract-content" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.589009 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="94094ea8-6702-4670-bd98-190530cccf8b" containerName="extract-content" Feb 27 02:12:39 crc kubenswrapper[4771]: E0227 02:12:39.589102 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94094ea8-6702-4670-bd98-190530cccf8b" containerName="registry-server" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.589172 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="94094ea8-6702-4670-bd98-190530cccf8b" containerName="registry-server" Feb 27 02:12:39 crc kubenswrapper[4771]: E0227 02:12:39.589238 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2947f34-0e2b-4968-9c29-ef67acacebb0" containerName="oc" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.589292 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2947f34-0e2b-4968-9c29-ef67acacebb0" containerName="oc" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.589527 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="94094ea8-6702-4670-bd98-190530cccf8b" containerName="registry-server" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.589654 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2947f34-0e2b-4968-9c29-ef67acacebb0" containerName="oc" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.591147 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.599017 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfs8z"] Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.618174 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hph\" (UniqueName: \"kubernetes.io/projected/c36e541b-bdad-4d20-9c0c-a97cc050f58a-kube-api-access-n5hph\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.618433 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-utilities\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.618683 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-catalog-content\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.720639 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hph\" (UniqueName: \"kubernetes.io/projected/c36e541b-bdad-4d20-9c0c-a97cc050f58a-kube-api-access-n5hph\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.720726 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-utilities\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.720791 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-catalog-content\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.721362 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-catalog-content\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.721475 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-utilities\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.745379 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hph\" (UniqueName: \"kubernetes.io/projected/c36e541b-bdad-4d20-9c0c-a97cc050f58a-kube-api-access-n5hph\") pod \"redhat-operators-zfs8z\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:39 crc kubenswrapper[4771]: I0227 02:12:39.921870 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:40 crc kubenswrapper[4771]: I0227 02:12:40.402472 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfs8z"] Feb 27 02:12:40 crc kubenswrapper[4771]: I0227 02:12:40.766410 4771 generic.go:334] "Generic (PLEG): container finished" podID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerID="bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80" exitCode=0 Feb 27 02:12:40 crc kubenswrapper[4771]: I0227 02:12:40.766460 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfs8z" event={"ID":"c36e541b-bdad-4d20-9c0c-a97cc050f58a","Type":"ContainerDied","Data":"bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80"} Feb 27 02:12:40 crc kubenswrapper[4771]: I0227 02:12:40.766877 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfs8z" event={"ID":"c36e541b-bdad-4d20-9c0c-a97cc050f58a","Type":"ContainerStarted","Data":"db60a77a0c04fb556fd4e3235bddb0e66dc1a023859b6e708b81b19db51b37dc"} Feb 27 02:12:41 crc kubenswrapper[4771]: I0227 02:12:41.793946 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfs8z" event={"ID":"c36e541b-bdad-4d20-9c0c-a97cc050f58a","Type":"ContainerStarted","Data":"5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309"} Feb 27 02:12:43 crc kubenswrapper[4771]: I0227 02:12:43.802040 4771 generic.go:334] "Generic (PLEG): container finished" podID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerID="5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309" exitCode=0 Feb 27 02:12:43 crc kubenswrapper[4771]: I0227 02:12:43.802100 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfs8z" event={"ID":"c36e541b-bdad-4d20-9c0c-a97cc050f58a","Type":"ContainerDied","Data":"5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309"} Feb 27 02:12:44 crc kubenswrapper[4771]: I0227 02:12:44.817122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfs8z" event={"ID":"c36e541b-bdad-4d20-9c0c-a97cc050f58a","Type":"ContainerStarted","Data":"2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde"} Feb 27 02:12:46 crc kubenswrapper[4771]: I0227 02:12:46.713913 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/util/0.log" Feb 27 02:12:46 crc kubenswrapper[4771]: I0227 02:12:46.906257 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/util/0.log" Feb 27 02:12:46 crc kubenswrapper[4771]: I0227 02:12:46.919923 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/pull/0.log" Feb 27 02:12:47 crc kubenswrapper[4771]: I0227 02:12:47.571303 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/pull/0.log" Feb 27 02:12:47 crc kubenswrapper[4771]: I0227 02:12:47.634478 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/util/0.log" Feb 27 02:12:47 crc kubenswrapper[4771]: I0227 02:12:47.958527 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/extract/0.log" Feb 27 02:12:47 crc kubenswrapper[4771]: I0227 02:12:47.961485 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d4ae269b85a93c719cf03f5d93793e77d783a0952e3fc349b078a0be53fzl5h_46247d46-066d-45e2-975a-8404fd28a0ac/pull/0.log" Feb 27 02:12:48 crc kubenswrapper[4771]: I0227 02:12:48.142028 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-snqrx_9f4615e8-ebc8-43ff-bdec-481f86af58bf/manager/0.log" Feb 27 02:12:48 crc kubenswrapper[4771]: I0227 02:12:48.517990 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-zlggr_f77508f2-411f-4644-9b48-7edbefaf3bb4/manager/0.log" Feb 27 02:12:48 crc kubenswrapper[4771]: I0227 02:12:48.582008 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-9jpm2_f882b343-7b46-4516-9a17-833858bbfda7/manager/0.log" Feb 27 02:12:48 crc kubenswrapper[4771]: I0227 02:12:48.787479 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-x969l_a40b776f-5677-4909-8b04-a5b2318737bc/manager/0.log" Feb 27 02:12:48 crc kubenswrapper[4771]: I0227 02:12:48.982037 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-p8rvj_646fbcd2-1bd9-4e76-a70b-c4812c6cdbf7/manager/0.log" Feb 27 02:12:49 crc kubenswrapper[4771]: I0227 02:12:49.413175 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-w5hxv_b563eec9-7160-44db-a640-4cf7e25bc893/manager/0.log" Feb 27 02:12:49 crc kubenswrapper[4771]: I0227 02:12:49.437444 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-4p5fg_5ea9fc68-1ea7-48fe-b692-f99747dbd694/manager/0.log" Feb 27 02:12:49 crc kubenswrapper[4771]: I0227 02:12:49.841992 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-t65sw_eb603c5e-cb7c-41e4-ac8a-f9a960141d16/manager/0.log" Feb 27 02:12:49 crc kubenswrapper[4771]: I0227 02:12:49.870735 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-llvjw_0c8b88b1-8f42-458c-933e-0bcd17da38cb/manager/0.log" Feb 27 02:12:49 crc kubenswrapper[4771]: I0227 02:12:49.922028 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:49 crc kubenswrapper[4771]: I0227 02:12:49.922189 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:12:50 crc kubenswrapper[4771]: I0227 02:12:50.099935 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-df8gr_17dfc012-107f-437d-bbfd-13a1250857ed/manager/0.log" Feb 27 02:12:50 crc kubenswrapper[4771]: I0227 02:12:50.225539 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-6j9rs_20a5fef1-ac14-40c6-bb97-6e6f39be1645/manager/0.log" Feb 27 02:12:50 crc kubenswrapper[4771]: I0227 02:12:50.473893 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-4fsjk_61b58ad1-8db7-4a41-9774-38781245baff/manager/0.log" Feb 27 02:12:50 crc kubenswrapper[4771]: I0227 02:12:50.570394 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-65x55_7a2d4cdb-cbb2-4910-a9b5-ae2adfb04205/manager/0.log" Feb 27 02:12:50 crc kubenswrapper[4771]: I0227 02:12:50.697998 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cxbvmq_e01a3024-1558-41e4-bbb4-06451d536782/manager/0.log" Feb 27 02:12:50 crc kubenswrapper[4771]: I0227 02:12:50.971600 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zfs8z" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="registry-server" probeResult="failure" output=< Feb 27 02:12:50 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 27 02:12:50 crc kubenswrapper[4771]: > Feb 27 02:12:50 crc kubenswrapper[4771]: I0227 02:12:50.979747 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-b5b8f6cf4-m2tbq_79f9396a-5f0c-4909-b710-4914faa9e011/operator/0.log" Feb 27 02:12:51 crc kubenswrapper[4771]: I0227 02:12:51.088088 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-d8jl8_6846ec0e-56f5-4bad-9539-0f6578027f45/registry-server/0.log" Feb 27 02:12:51 crc kubenswrapper[4771]: I0227 02:12:51.238945 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-2qcds_7cf10a28-d86e-4299-8b06-84888ca3dcb9/manager/0.log" Feb 27 02:12:51 crc kubenswrapper[4771]: I0227 02:12:51.452921 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-vbhct_e5ed9ba2-1499-42b0-9a16-213f7bd6336f/manager/0.log" Feb 27 02:12:51 crc kubenswrapper[4771]: I0227 02:12:51.585602 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2rpsh_bef6603d-191e-4d4b-b824-4a8d4f81c991/operator/0.log" Feb 27 02:12:51 crc kubenswrapper[4771]: I0227 02:12:51.701668 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-d8xdb_aece7f0f-11e5-4934-b818-f8c92e54439b/manager/0.log" Feb 27 02:12:51 crc kubenswrapper[4771]: I0227 02:12:51.774656 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:12:51 crc kubenswrapper[4771]: E0227 02:12:51.775047 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:12:51 crc kubenswrapper[4771]: I0227 02:12:51.981405 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-wb7w9_a7c97c14-2dc7-409a-bb85-7e10031e839b/manager/0.log" Feb 27 02:12:51 crc kubenswrapper[4771]: I0227 02:12:51.990013 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-cp5l8_987278ec-2526-4db5-a442-58b38687805c/manager/0.log" Feb 27 02:12:52 crc kubenswrapper[4771]: I0227 02:12:52.203612 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-lg7vn_981a63b0-1a15-42f0-8d4a-0dc24dbd87b1/manager/0.log" Feb 27 02:12:52 crc kubenswrapper[4771]: I0227 02:12:52.627559 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dc6fb848b-7nk64_b4a70780-ab41-4199-b1b8-09b01cd6a4ac/manager/0.log" Feb 27 02:12:56 crc kubenswrapper[4771]: I0227 02:12:56.203054 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-mrvth_8bd8d6ef-0025-4148-a530-1964ae763645/manager/0.log" Feb 27 02:13:00 crc kubenswrapper[4771]: I0227 02:13:00.986378 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zfs8z" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="registry-server" probeResult="failure" output=< Feb 27 02:13:00 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Feb 27 02:13:00 crc kubenswrapper[4771]: > Feb 27 02:13:06 crc kubenswrapper[4771]: I0227 02:13:06.773240 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:13:06 crc kubenswrapper[4771]: E0227 02:13:06.773938 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:13:10 crc kubenswrapper[4771]: I0227 02:13:10.978002 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zfs8z" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="registry-server" probeResult="failure" output=< Feb 27 02:13:10 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s 
Feb 27 02:13:10 crc kubenswrapper[4771]: > Feb 27 02:13:14 crc kubenswrapper[4771]: I0227 02:13:14.173243 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2qwgc_62c59a17-8b65-4876-a007-1cb1f45a7c2b/control-plane-machine-set-operator/0.log" Feb 27 02:13:14 crc kubenswrapper[4771]: I0227 02:13:14.368333 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4vrtf_7bd5b18f-fa8c-46d4-a571-630a67b14023/kube-rbac-proxy/0.log" Feb 27 02:13:14 crc kubenswrapper[4771]: I0227 02:13:14.393402 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4vrtf_7bd5b18f-fa8c-46d4-a571-630a67b14023/machine-api-operator/0.log" Feb 27 02:13:17 crc kubenswrapper[4771]: I0227 02:13:17.814376 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:13:17 crc kubenswrapper[4771]: E0227 02:13:17.815375 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:13:19 crc kubenswrapper[4771]: I0227 02:13:19.964381 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:13:19 crc kubenswrapper[4771]: I0227 02:13:19.982251 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zfs8z" podStartSLOduration=37.48908363 podStartE2EDuration="40.982236096s" podCreationTimestamp="2026-02-27 02:12:39 +0000 UTC" firstStartedPulling="2026-02-27 02:12:40.768789553 +0000 UTC m=+4073.706350841" lastFinishedPulling="2026-02-27 02:12:44.261942019 +0000 UTC m=+4077.199503307" observedRunningTime="2026-02-27 02:12:44.843713008 +0000 UTC m=+4077.781274306" watchObservedRunningTime="2026-02-27 02:13:19.982236096 +0000 UTC m=+4112.919797384" Feb 27 02:13:20 crc kubenswrapper[4771]: I0227 02:13:20.034311 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:13:20 crc kubenswrapper[4771]: I0227 02:13:20.200005 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfs8z"] Feb 27 02:13:21 crc kubenswrapper[4771]: I0227 02:13:21.359807 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zfs8z" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="registry-server" containerID="cri-o://2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde" gracePeriod=2 Feb 27 02:13:21 crc kubenswrapper[4771]: I0227 02:13:21.860473 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:13:21 crc kubenswrapper[4771]: I0227 02:13:21.991221 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-utilities\") pod \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " Feb 27 02:13:21 crc kubenswrapper[4771]: I0227 02:13:21.991279 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5hph\" (UniqueName: \"kubernetes.io/projected/c36e541b-bdad-4d20-9c0c-a97cc050f58a-kube-api-access-n5hph\") pod \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " Feb 27 02:13:21 crc kubenswrapper[4771]: I0227 02:13:21.991340 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-catalog-content\") pod \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\" (UID: \"c36e541b-bdad-4d20-9c0c-a97cc050f58a\") " Feb 27 02:13:21 crc kubenswrapper[4771]: I0227 02:13:21.991837 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-utilities" (OuterVolumeSpecName: "utilities") pod "c36e541b-bdad-4d20-9c0c-a97cc050f58a" (UID: "c36e541b-bdad-4d20-9c0c-a97cc050f58a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:13:21 crc kubenswrapper[4771]: I0227 02:13:21.992540 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 02:13:21 crc kubenswrapper[4771]: I0227 02:13:21.999025 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c36e541b-bdad-4d20-9c0c-a97cc050f58a-kube-api-access-n5hph" (OuterVolumeSpecName: "kube-api-access-n5hph") pod "c36e541b-bdad-4d20-9c0c-a97cc050f58a" (UID: "c36e541b-bdad-4d20-9c0c-a97cc050f58a"). InnerVolumeSpecName "kube-api-access-n5hph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.094945 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5hph\" (UniqueName: \"kubernetes.io/projected/c36e541b-bdad-4d20-9c0c-a97cc050f58a-kube-api-access-n5hph\") on node \"crc\" DevicePath \"\"" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.151790 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c36e541b-bdad-4d20-9c0c-a97cc050f58a" (UID: "c36e541b-bdad-4d20-9c0c-a97cc050f58a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.196945 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c36e541b-bdad-4d20-9c0c-a97cc050f58a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.373191 4771 generic.go:334] "Generic (PLEG): container finished" podID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerID="2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde" exitCode=0 Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.373237 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfs8z" event={"ID":"c36e541b-bdad-4d20-9c0c-a97cc050f58a","Type":"ContainerDied","Data":"2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde"} Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.373281 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfs8z" event={"ID":"c36e541b-bdad-4d20-9c0c-a97cc050f58a","Type":"ContainerDied","Data":"db60a77a0c04fb556fd4e3235bddb0e66dc1a023859b6e708b81b19db51b37dc"} Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.373305 4771 scope.go:117] "RemoveContainer" containerID="2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.373313 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfs8z" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.405882 4771 scope.go:117] "RemoveContainer" containerID="5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.446862 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfs8z"] Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.456909 4771 scope.go:117] "RemoveContainer" containerID="bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.462664 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zfs8z"] Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.508430 4771 scope.go:117] "RemoveContainer" containerID="2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde" Feb 27 02:13:22 crc kubenswrapper[4771]: E0227 02:13:22.509099 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde\": container with ID starting with 2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde not found: ID does not exist" containerID="2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.509143 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde"} err="failed to get container status \"2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde\": rpc error: code = NotFound desc = could not find container \"2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde\": container with ID starting with 2bab9f893f2106bc2edb27105d53ef86125cf0c5d9065f87a791bedea663cbde not found: ID does not exist" Feb 27 02:13:22 crc 
kubenswrapper[4771]: I0227 02:13:22.509173 4771 scope.go:117] "RemoveContainer" containerID="5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309" Feb 27 02:13:22 crc kubenswrapper[4771]: E0227 02:13:22.509805 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309\": container with ID starting with 5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309 not found: ID does not exist" containerID="5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.509827 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309"} err="failed to get container status \"5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309\": rpc error: code = NotFound desc = could not find container \"5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309\": container with ID starting with 5a0b0626649ac614e0149a199c3532d215856365ff55b0e74a416d4253c66309 not found: ID does not exist" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.509839 4771 scope.go:117] "RemoveContainer" containerID="bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80" Feb 27 02:13:22 crc kubenswrapper[4771]: E0227 02:13:22.511772 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80\": container with ID starting with bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80 not found: ID does not exist" containerID="bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80" Feb 27 02:13:22 crc kubenswrapper[4771]: I0227 02:13:22.511830 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80"} err="failed to get container status \"bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80\": rpc error: code = NotFound desc = could not find container \"bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80\": container with ID starting with bbab0faac94c43c08647e792e52f30f85f6dad50e22fdfccf0be159081000e80 not found: ID does not exist" Feb 27 02:13:23 crc kubenswrapper[4771]: I0227 02:13:23.787229 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" path="/var/lib/kubelet/pods/c36e541b-bdad-4d20-9c0c-a97cc050f58a/volumes" Feb 27 02:13:28 crc kubenswrapper[4771]: I0227 02:13:28.448728 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-6n589_6a0dd098-846f-4aab-b87f-4d06728195c5/cert-manager-controller/0.log" Feb 27 02:13:28 crc kubenswrapper[4771]: I0227 02:13:28.564757 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8pmgm_dca42308-0eb3-4c5b-a620-cbbb29c3c88f/cert-manager-cainjector/0.log" Feb 27 02:13:28 crc kubenswrapper[4771]: I0227 02:13:28.687118 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-gd2tq_d066e334-9b58-464d-80d8-899a6390d5c5/cert-manager-webhook/0.log" Feb 27 02:13:31 crc kubenswrapper[4771]: I0227 02:13:31.773395 4771 scope.go:117] "RemoveContainer" 
containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:13:31 crc kubenswrapper[4771]: E0227 02:13:31.774266 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:13:37 crc kubenswrapper[4771]: I0227 02:13:37.973926 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4pwsw"] Feb 27 02:13:37 crc kubenswrapper[4771]: E0227 02:13:37.975013 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="extract-content" Feb 27 02:13:37 crc kubenswrapper[4771]: I0227 02:13:37.975033 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="extract-content" Feb 27 02:13:37 crc kubenswrapper[4771]: E0227 02:13:37.975070 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="registry-server" Feb 27 02:13:37 crc kubenswrapper[4771]: I0227 02:13:37.975081 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="registry-server" Feb 27 02:13:37 crc kubenswrapper[4771]: E0227 02:13:37.975119 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="extract-utilities" Feb 27 02:13:37 crc kubenswrapper[4771]: I0227 02:13:37.975133 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="extract-utilities" Feb 27 02:13:37 crc kubenswrapper[4771]: I0227 02:13:37.975441 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36e541b-bdad-4d20-9c0c-a97cc050f58a" containerName="registry-server" Feb 27 02:13:37 crc kubenswrapper[4771]: I0227 02:13:37.977587 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:37 crc kubenswrapper[4771]: I0227 02:13:37.997961 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pwsw"] Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.020374 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br8x8\" (UniqueName: \"kubernetes.io/projected/6d5fb64b-1b6e-497a-a023-09eb1509c282-kube-api-access-br8x8\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.020487 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-utilities\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.020622 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-catalog-content\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.122113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br8x8\" (UniqueName: \"kubernetes.io/projected/6d5fb64b-1b6e-497a-a023-09eb1509c282-kube-api-access-br8x8\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.122182 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-utilities\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.122296 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-catalog-content\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.122948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-catalog-content\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.122959 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-utilities\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.147451 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-br8x8\" (UniqueName: \"kubernetes.io/projected/6d5fb64b-1b6e-497a-a023-09eb1509c282-kube-api-access-br8x8\") pod \"community-operators-4pwsw\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.344696 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:38 crc kubenswrapper[4771]: I0227 02:13:38.824515 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4pwsw"] Feb 27 02:13:39 crc kubenswrapper[4771]: I0227 02:13:39.526593 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerID="d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4" exitCode=0 Feb 27 02:13:39 crc kubenswrapper[4771]: I0227 02:13:39.526688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pwsw" event={"ID":"6d5fb64b-1b6e-497a-a023-09eb1509c282","Type":"ContainerDied","Data":"d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4"} Feb 27 02:13:39 crc kubenswrapper[4771]: I0227 02:13:39.526953 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pwsw" event={"ID":"6d5fb64b-1b6e-497a-a023-09eb1509c282","Type":"ContainerStarted","Data":"b6000d49e0c507538800b3102da69daeadf9f0b31b0fbdca252372fa84307731"} Feb 27 02:13:40 crc kubenswrapper[4771]: I0227 02:13:40.538206 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pwsw" event={"ID":"6d5fb64b-1b6e-497a-a023-09eb1509c282","Type":"ContainerStarted","Data":"c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a"} Feb 27 02:13:41 crc kubenswrapper[4771]: I0227 02:13:41.549117 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerID="c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a" exitCode=0 Feb 27 02:13:41 crc kubenswrapper[4771]: I0227 02:13:41.549165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pwsw" event={"ID":"6d5fb64b-1b6e-497a-a023-09eb1509c282","Type":"ContainerDied","Data":"c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a"} Feb 27 02:13:42 crc kubenswrapper[4771]: I0227 02:13:42.562415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pwsw" event={"ID":"6d5fb64b-1b6e-497a-a023-09eb1509c282","Type":"ContainerStarted","Data":"10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb"} Feb 27 02:13:42 crc kubenswrapper[4771]: I0227 02:13:42.609972 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4pwsw" podStartSLOduration=3.169117388 podStartE2EDuration="5.609952548s" podCreationTimestamp="2026-02-27 02:13:37 +0000 UTC" firstStartedPulling="2026-02-27 02:13:39.528228231 +0000 UTC m=+4132.465789519" lastFinishedPulling="2026-02-27 02:13:41.969063371 +0000 UTC m=+4134.906624679" observedRunningTime="2026-02-27 02:13:42.604640922 +0000 UTC m=+4135.542202210" watchObservedRunningTime="2026-02-27 02:13:42.609952548 +0000 UTC m=+4135.547513836" Feb 27 02:13:42 crc kubenswrapper[4771]: I0227 02:13:42.773734 4771 scope.go:117] "RemoveContainer" 
containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:13:42 crc kubenswrapper[4771]: E0227 02:13:42.773974 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:13:44 crc kubenswrapper[4771]: I0227 02:13:44.619472 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-thhng_7c2f136b-c273-45f2-bbd2-923046cf0861/nmstate-console-plugin/0.log" Feb 27 02:13:44 crc kubenswrapper[4771]: I0227 02:13:44.671524 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ln8xd_7b2bdabc-b325-4bc2-91f8-39e9f12ec946/nmstate-handler/0.log" Feb 27 02:13:44 crc kubenswrapper[4771]: I0227 02:13:44.877270 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-bkc9p_6049b388-cb33-408a-848e-90a3e9767488/kube-rbac-proxy/0.log" Feb 27 02:13:44 crc kubenswrapper[4771]: I0227 02:13:44.903747 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-bkc9p_6049b388-cb33-408a-848e-90a3e9767488/nmstate-metrics/0.log" Feb 27 02:13:45 crc kubenswrapper[4771]: I0227 02:13:45.043714 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-tmk5t_b7560148-b519-4709-a6a8-184258052e14/nmstate-operator/0.log" Feb 27 02:13:45 crc kubenswrapper[4771]: I0227 02:13:45.101540 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-pfb4d_397a2bf0-511c-4cc9-964c-e1d2efc662ea/nmstate-webhook/0.log" Feb 27 02:13:48 crc kubenswrapper[4771]: I0227 02:13:48.346173 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:48 crc kubenswrapper[4771]: I0227 02:13:48.346741 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:48 crc kubenswrapper[4771]: I0227 02:13:48.389880 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:48 crc kubenswrapper[4771]: I0227 02:13:48.675644 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:48 crc kubenswrapper[4771]: I0227 02:13:48.722344 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pwsw"] Feb 27 02:13:50 crc kubenswrapper[4771]: I0227 02:13:50.649527 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4pwsw" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerName="registry-server" containerID="cri-o://10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb" gracePeriod=2 Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.111681 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.267586 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-utilities\") pod \"6d5fb64b-1b6e-497a-a023-09eb1509c282\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.267729 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-catalog-content\") pod \"6d5fb64b-1b6e-497a-a023-09eb1509c282\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.267760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br8x8\" (UniqueName: \"kubernetes.io/projected/6d5fb64b-1b6e-497a-a023-09eb1509c282-kube-api-access-br8x8\") pod \"6d5fb64b-1b6e-497a-a023-09eb1509c282\" (UID: \"6d5fb64b-1b6e-497a-a023-09eb1509c282\") " Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.268486 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-utilities" (OuterVolumeSpecName: "utilities") pod "6d5fb64b-1b6e-497a-a023-09eb1509c282" (UID: "6d5fb64b-1b6e-497a-a023-09eb1509c282"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.285814 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5fb64b-1b6e-497a-a023-09eb1509c282-kube-api-access-br8x8" (OuterVolumeSpecName: "kube-api-access-br8x8") pod "6d5fb64b-1b6e-497a-a023-09eb1509c282" (UID: "6d5fb64b-1b6e-497a-a023-09eb1509c282"). InnerVolumeSpecName "kube-api-access-br8x8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.370762 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.370813 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br8x8\" (UniqueName: \"kubernetes.io/projected/6d5fb64b-1b6e-497a-a023-09eb1509c282-kube-api-access-br8x8\") on node \"crc\" DevicePath \"\"" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.659450 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerID="10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb" exitCode=0 Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.659492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pwsw" event={"ID":"6d5fb64b-1b6e-497a-a023-09eb1509c282","Type":"ContainerDied","Data":"10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb"} Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.659518 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4pwsw" event={"ID":"6d5fb64b-1b6e-497a-a023-09eb1509c282","Type":"ContainerDied","Data":"b6000d49e0c507538800b3102da69daeadf9f0b31b0fbdca252372fa84307731"} Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.659533 4771 scope.go:117] "RemoveContainer" containerID="10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.659678 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4pwsw" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.666890 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d5fb64b-1b6e-497a-a023-09eb1509c282" (UID: "6d5fb64b-1b6e-497a-a023-09eb1509c282"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.678336 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5fb64b-1b6e-497a-a023-09eb1509c282-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.686577 4771 scope.go:117] "RemoveContainer" containerID="c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.713038 4771 scope.go:117] "RemoveContainer" containerID="d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.787799 4771 scope.go:117] "RemoveContainer" containerID="10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb" Feb 27 02:13:51 crc kubenswrapper[4771]: E0227 02:13:51.788228 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb\": container with ID starting with 10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb not found: ID does not exist" containerID="10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.788268 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb"} err="failed to get container status \"10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb\": rpc error: code = NotFound desc = could not find container \"10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb\": container with ID starting with 10fc7b1555b8e058900c1c66ebcfbf1fb7379c84d8cad8e281d92114374c76bb not found: ID does not exist" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.788292 4771 scope.go:117] "RemoveContainer" containerID="c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a" Feb 27 02:13:51 crc kubenswrapper[4771]: E0227 02:13:51.788581 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a\": container with ID starting with c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a not found: ID does not exist" containerID="c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.788613 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a"} err="failed to get container status \"c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a\": rpc error: code = NotFound desc = could not find container \"c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a\": container with ID starting with c581e021dc1885078e9b6433de1b170f8fca8a8496890a62f3d58f29ea208b1a not found: ID does not exist" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.788632 4771 scope.go:117] "RemoveContainer" containerID="d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4" Feb 27 02:13:51 crc kubenswrapper[4771]: E0227 02:13:51.788934 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4\": container with ID starting with d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4 not found: ID does not exist" containerID="d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.788972 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4"} err="failed to get container status \"d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4\": rpc error: code = NotFound desc = could not find container \"d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4\": container with ID starting with d255dbf3ae1079fe30344ba9d42b8763d380c1c8bb27d08bd341ce955c72dcb4 not found: ID does not exist" Feb 27 02:13:51 crc kubenswrapper[4771]: I0227 02:13:51.989369 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4pwsw"] Feb 27 02:13:52 crc kubenswrapper[4771]: I0227 02:13:52.004036 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4pwsw"] Feb 27 02:13:53 crc kubenswrapper[4771]: I0227 02:13:53.787635 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" path="/var/lib/kubelet/pods/6d5fb64b-1b6e-497a-a023-09eb1509c282/volumes" Feb 27 02:13:56 crc kubenswrapper[4771]: I0227 02:13:56.773611 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:13:56 crc kubenswrapper[4771]: E0227 02:13:56.775004 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.144858 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535974-sfhz8"] Feb 27 02:14:00 crc kubenswrapper[4771]: E0227 02:14:00.145953 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerName="extract-content" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.145972 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerName="extract-content" Feb 27 02:14:00 crc kubenswrapper[4771]: E0227 02:14:00.146006 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerName="registry-server" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.146016 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerName="registry-server" Feb 27 02:14:00 crc kubenswrapper[4771]: E0227 02:14:00.146030 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerName="extract-utilities" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.146038 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerName="extract-utilities" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 
02:14:00.146281 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5fb64b-1b6e-497a-a023-09eb1509c282" containerName="registry-server" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.147084 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535974-sfhz8" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.150091 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.150344 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.153908 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535974-sfhz8"] Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.155307 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.338047 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48tq\" (UniqueName: \"kubernetes.io/projected/4008d1c8-f1f6-4209-8037-2a9c68e3823a-kube-api-access-d48tq\") pod \"auto-csr-approver-29535974-sfhz8\" (UID: \"4008d1c8-f1f6-4209-8037-2a9c68e3823a\") " pod="openshift-infra/auto-csr-approver-29535974-sfhz8" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.439747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d48tq\" (UniqueName: \"kubernetes.io/projected/4008d1c8-f1f6-4209-8037-2a9c68e3823a-kube-api-access-d48tq\") pod \"auto-csr-approver-29535974-sfhz8\" (UID: \"4008d1c8-f1f6-4209-8037-2a9c68e3823a\") " pod="openshift-infra/auto-csr-approver-29535974-sfhz8" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.462986 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48tq\" (UniqueName: \"kubernetes.io/projected/4008d1c8-f1f6-4209-8037-2a9c68e3823a-kube-api-access-d48tq\") pod \"auto-csr-approver-29535974-sfhz8\" (UID: \"4008d1c8-f1f6-4209-8037-2a9c68e3823a\") " pod="openshift-infra/auto-csr-approver-29535974-sfhz8" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.471095 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535974-sfhz8" Feb 27 02:14:00 crc kubenswrapper[4771]: I0227 02:14:00.930229 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535974-sfhz8"] Feb 27 02:14:01 crc kubenswrapper[4771]: I0227 02:14:01.750436 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535974-sfhz8" event={"ID":"4008d1c8-f1f6-4209-8037-2a9c68e3823a","Type":"ContainerStarted","Data":"4e10f9c164ae76a06e0db1de054eaf42d46384dd3838f2a2a968ea8e92b0a32f"} Feb 27 02:14:02 crc kubenswrapper[4771]: I0227 02:14:02.759723 4771 generic.go:334] "Generic (PLEG): container finished" podID="4008d1c8-f1f6-4209-8037-2a9c68e3823a" containerID="d62f8ae0d5dd14962f5fdb3119317ffbc9250ddc6c7cc1c721ed88e79aeefd95" exitCode=0 Feb 27 02:14:02 crc kubenswrapper[4771]: I0227 02:14:02.759770 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535974-sfhz8" event={"ID":"4008d1c8-f1f6-4209-8037-2a9c68e3823a","Type":"ContainerDied","Data":"d62f8ae0d5dd14962f5fdb3119317ffbc9250ddc6c7cc1c721ed88e79aeefd95"} Feb 27 02:14:04 crc kubenswrapper[4771]: I0227 02:14:04.162611 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535974-sfhz8" Feb 27 02:14:04 crc kubenswrapper[4771]: I0227 02:14:04.206234 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d48tq\" (UniqueName: \"kubernetes.io/projected/4008d1c8-f1f6-4209-8037-2a9c68e3823a-kube-api-access-d48tq\") pod \"4008d1c8-f1f6-4209-8037-2a9c68e3823a\" (UID: \"4008d1c8-f1f6-4209-8037-2a9c68e3823a\") " Feb 27 02:14:04 crc kubenswrapper[4771]: I0227 02:14:04.211472 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4008d1c8-f1f6-4209-8037-2a9c68e3823a-kube-api-access-d48tq" (OuterVolumeSpecName: "kube-api-access-d48tq") pod "4008d1c8-f1f6-4209-8037-2a9c68e3823a" (UID: "4008d1c8-f1f6-4209-8037-2a9c68e3823a"). InnerVolumeSpecName "kube-api-access-d48tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:14:04 crc kubenswrapper[4771]: I0227 02:14:04.308410 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d48tq\" (UniqueName: \"kubernetes.io/projected/4008d1c8-f1f6-4209-8037-2a9c68e3823a-kube-api-access-d48tq\") on node \"crc\" DevicePath \"\"" Feb 27 02:14:04 crc kubenswrapper[4771]: I0227 02:14:04.781287 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535974-sfhz8" event={"ID":"4008d1c8-f1f6-4209-8037-2a9c68e3823a","Type":"ContainerDied","Data":"4e10f9c164ae76a06e0db1de054eaf42d46384dd3838f2a2a968ea8e92b0a32f"} Feb 27 02:14:04 crc kubenswrapper[4771]: I0227 02:14:04.781327 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535974-sfhz8" Feb 27 02:14:04 crc kubenswrapper[4771]: I0227 02:14:04.781348 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e10f9c164ae76a06e0db1de054eaf42d46384dd3838f2a2a968ea8e92b0a32f" Feb 27 02:14:05 crc kubenswrapper[4771]: I0227 02:14:05.235235 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535968-qgjbh"] Feb 27 02:14:05 crc kubenswrapper[4771]: I0227 02:14:05.246494 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535968-qgjbh"] Feb 27 02:14:05 crc kubenswrapper[4771]: I0227 02:14:05.783322 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaff08b9-734f-4294-9467-5bd95b60d836" path="/var/lib/kubelet/pods/eaff08b9-734f-4294-9467-5bd95b60d836/volumes" Feb 27 02:14:11 crc kubenswrapper[4771]: I0227 02:14:11.773651 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:14:11 crc kubenswrapper[4771]: E0227 02:14:11.774577 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:14:14 crc kubenswrapper[4771]: I0227 02:14:14.989023 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sgfp9_cd363b49-3f3c-46af-834d-5ab27e2ed35e/kube-rbac-proxy/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.032360 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sgfp9_cd363b49-3f3c-46af-834d-5ab27e2ed35e/controller/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.185587 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-frr-files/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.315819 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-reloader/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.360769 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-frr-files/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.373860 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-metrics/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.382066 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-reloader/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.521358 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-frr-files/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.527823 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-reloader/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: 
I0227 02:14:15.588273 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-metrics/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.594307 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-metrics/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.756715 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-reloader/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.759942 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-frr-files/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.760880 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/cp-metrics/0.log" Feb 27 02:14:15 crc kubenswrapper[4771]: I0227 02:14:15.800348 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/controller/0.log" Feb 27 02:14:16 crc kubenswrapper[4771]: I0227 02:14:16.396565 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/frr-metrics/0.log" Feb 27 02:14:16 crc kubenswrapper[4771]: I0227 02:14:16.447149 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/kube-rbac-proxy/0.log" Feb 27 02:14:16 crc kubenswrapper[4771]: I0227 02:14:16.453710 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/kube-rbac-proxy-frr/0.log" Feb 27 02:14:16 crc kubenswrapper[4771]: I0227 02:14:16.612923 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/reloader/0.log" Feb 27 02:14:16 crc kubenswrapper[4771]: I0227 02:14:16.653819 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-8mq2q_4e3da97e-a051-4d50-b905-3ed4c804cfc6/frr-k8s-webhook-server/0.log" Feb 27 02:14:16 crc kubenswrapper[4771]: I0227 02:14:16.856397 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-d75cc4945-g8fp7_3294b45f-a2de-4a92-8466-46c17ddd0238/manager/0.log" Feb 27 02:14:16 crc kubenswrapper[4771]: I0227 02:14:16.984807 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65c77fdb5d-6ltq2_635de0be-09c0-49ad-905c-49caa1c8b50e/webhook-server/0.log" Feb 27 02:14:17 crc kubenswrapper[4771]: I0227 02:14:17.051364 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l5k7x_61a9b00d-d330-4575-bdac-adff64f6786d/kube-rbac-proxy/0.log" Feb 27 02:14:17 crc kubenswrapper[4771]: I0227 02:14:17.607077 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-l5k7x_61a9b00d-d330-4575-bdac-adff64f6786d/speaker/0.log" Feb 27 02:14:18 crc kubenswrapper[4771]: I0227 02:14:18.032971 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cvvlf_88db3f72-8aff-4838-b58f-a37d0e2e2e64/frr/0.log" Feb 27 02:14:25 crc kubenswrapper[4771]: I0227 02:14:25.773666 4771 scope.go:117] 
"RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:14:25 crc kubenswrapper[4771]: E0227 02:14:25.774642 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:14:30 crc kubenswrapper[4771]: I0227 02:14:30.230668 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/util/0.log" Feb 27 02:14:30 crc kubenswrapper[4771]: I0227 02:14:30.325297 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/pull/0.log" Feb 27 02:14:30 crc kubenswrapper[4771]: I0227 02:14:30.343642 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/util/0.log" Feb 27 02:14:30 crc kubenswrapper[4771]: I0227 02:14:30.398497 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/pull/0.log" Feb 27 02:14:30 crc kubenswrapper[4771]: I0227 02:14:30.585030 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/util/0.log" Feb 27 02:14:30 crc kubenswrapper[4771]: I0227 02:14:30.585150 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/extract/0.log" Feb 27 02:14:30 crc kubenswrapper[4771]: I0227 02:14:30.592066 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82x6fck_78ae3d79-21d2-41f2-8685-9eeb9095dbb9/pull/0.log" Feb 27 02:14:30 crc kubenswrapper[4771]: I0227 02:14:30.793894 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-utilities/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.001768 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-utilities/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.004540 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-content/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.009259 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-content/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.179264 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-content/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.267850 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/extract-utilities/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.627843 4771 scope.go:117] "RemoveContainer" containerID="8011ca3f487eb8d145b3f1c27f6e89cfcad8277d511ff936c8a94a1b41fdfc8c" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.713924 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-utilities/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.759703 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6b7jf_54ef043a-8831-43e4-abb7-583a36418b6c/registry-server/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.922847 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-content/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.929311 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-utilities/0.log" Feb 27 02:14:31 crc kubenswrapper[4771]: I0227 02:14:31.953787 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-content/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 02:14:32.181904 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-utilities/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 02:14:32.241475 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/extract-content/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 02:14:32.439493 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/util/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 02:14:32.672904 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/util/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 02:14:32.687495 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/pull/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 02:14:32.708926 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/pull/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 02:14:32.872967 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/pull/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 
02:14:32.919500 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/extract/0.log" Feb 27 02:14:32 crc kubenswrapper[4771]: I0227 02:14:32.930765 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4r7nt7_008a5eed-f47a-4fd7-8fbe-c442e115da9a/util/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.088656 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n48ck_005f8696-cd2c-46e3-8edb-d9c9c9652871/registry-server/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.133263 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jffnf_9cb60be5-a0ff-489e-a473-32a72359b2ce/marketplace-operator/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.303824 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-utilities/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.524125 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-utilities/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.557528 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-content/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.581002 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-content/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.704310 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-utilities/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.748856 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/extract-content/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.849078 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sl98j_a25ddae9-53d5-4d86-9914-10fc8e695cb3/registry-server/0.log" Feb 27 02:14:33 crc kubenswrapper[4771]: I0227 02:14:33.933605 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-utilities/0.log" Feb 27 02:14:34 crc kubenswrapper[4771]: I0227 02:14:34.100106 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-content/0.log" Feb 27 02:14:34 crc kubenswrapper[4771]: I0227 02:14:34.139953 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-content/0.log" Feb 27 02:14:34 crc kubenswrapper[4771]: I0227 02:14:34.153466 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-utilities/0.log" Feb 27 
02:14:34 crc kubenswrapper[4771]: I0227 02:14:34.296657 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-utilities/0.log" Feb 27 02:14:34 crc kubenswrapper[4771]: I0227 02:14:34.323302 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/extract-content/0.log" Feb 27 02:14:34 crc kubenswrapper[4771]: I0227 02:14:34.823778 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-nm89l_eb1594d2-dbd5-4e37-8d97-dac2a6357808/registry-server/0.log" Feb 27 02:14:37 crc kubenswrapper[4771]: I0227 02:14:37.779011 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:14:37 crc kubenswrapper[4771]: E0227 02:14:37.779825 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:14:48 crc kubenswrapper[4771]: I0227 02:14:48.773140 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:14:48 crc kubenswrapper[4771]: E0227 02:14:48.773917 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.157807 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh"] Feb 27 02:15:00 crc kubenswrapper[4771]: E0227 02:15:00.158663 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4008d1c8-f1f6-4209-8037-2a9c68e3823a" containerName="oc" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.158677 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4008d1c8-f1f6-4209-8037-2a9c68e3823a" containerName="oc" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.158869 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4008d1c8-f1f6-4209-8037-2a9c68e3823a" containerName="oc" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.159496 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.174944 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.175845 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.211875 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh"] Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.222273 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc140a39-01f4-4730-a525-4a1f8e4477fb-secret-volume\") pod \"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.222450 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddtmr\" (UniqueName: \"kubernetes.io/projected/bc140a39-01f4-4730-a525-4a1f8e4477fb-kube-api-access-ddtmr\") pod \"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.222481 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc140a39-01f4-4730-a525-4a1f8e4477fb-config-volume\") pod \"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.324416 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddtmr\" (UniqueName: \"kubernetes.io/projected/bc140a39-01f4-4730-a525-4a1f8e4477fb-kube-api-access-ddtmr\") pod \"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.324468 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc140a39-01f4-4730-a525-4a1f8e4477fb-config-volume\") pod \"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.324529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc140a39-01f4-4730-a525-4a1f8e4477fb-secret-volume\") pod \"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.326282 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc140a39-01f4-4730-a525-4a1f8e4477fb-config-volume\") pod 
\"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.335208 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc140a39-01f4-4730-a525-4a1f8e4477fb-secret-volume\") pod \"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.375333 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddtmr\" (UniqueName: \"kubernetes.io/projected/bc140a39-01f4-4730-a525-4a1f8e4477fb-kube-api-access-ddtmr\") pod \"collect-profiles-29535975-cgjjh\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:00 crc kubenswrapper[4771]: I0227 02:15:00.496204 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:01 crc kubenswrapper[4771]: I0227 02:15:01.085198 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh"] Feb 27 02:15:01 crc kubenswrapper[4771]: I0227 02:15:01.311821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" event={"ID":"bc140a39-01f4-4730-a525-4a1f8e4477fb","Type":"ContainerStarted","Data":"1f32af58f8e7a520d0e9d92c31e7daa02a0337651978719a41064b4b3b7ea3f9"} Feb 27 02:15:01 crc kubenswrapper[4771]: I0227 02:15:01.312085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" event={"ID":"bc140a39-01f4-4730-a525-4a1f8e4477fb","Type":"ContainerStarted","Data":"ec3b1f163afce6c1311c56dde2258cc3e6eab96339a48cf3dbc59e10f428d80e"} Feb 27 02:15:01 crc kubenswrapper[4771]: I0227 02:15:01.345434 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" podStartSLOduration=1.34541297 podStartE2EDuration="1.34541297s" podCreationTimestamp="2026-02-27 02:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 02:15:01.332029998 +0000 UTC m=+4214.269591286" watchObservedRunningTime="2026-02-27 02:15:01.34541297 +0000 UTC m=+4214.282974258" Feb 27 02:15:01 crc kubenswrapper[4771]: I0227 02:15:01.778171 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:15:01 crc kubenswrapper[4771]: E0227 02:15:01.778710 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:15:02 crc kubenswrapper[4771]: I0227 02:15:02.321962 4771 generic.go:334] "Generic (PLEG): container finished" podID="bc140a39-01f4-4730-a525-4a1f8e4477fb" 
containerID="1f32af58f8e7a520d0e9d92c31e7daa02a0337651978719a41064b4b3b7ea3f9" exitCode=0 Feb 27 02:15:02 crc kubenswrapper[4771]: I0227 02:15:02.322016 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" event={"ID":"bc140a39-01f4-4730-a525-4a1f8e4477fb","Type":"ContainerDied","Data":"1f32af58f8e7a520d0e9d92c31e7daa02a0337651978719a41064b4b3b7ea3f9"} Feb 27 02:15:03 crc kubenswrapper[4771]: I0227 02:15:03.765310 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:03 crc kubenswrapper[4771]: I0227 02:15:03.914280 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddtmr\" (UniqueName: \"kubernetes.io/projected/bc140a39-01f4-4730-a525-4a1f8e4477fb-kube-api-access-ddtmr\") pod \"bc140a39-01f4-4730-a525-4a1f8e4477fb\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " Feb 27 02:15:03 crc kubenswrapper[4771]: I0227 02:15:03.914420 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc140a39-01f4-4730-a525-4a1f8e4477fb-config-volume\") pod \"bc140a39-01f4-4730-a525-4a1f8e4477fb\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " Feb 27 02:15:03 crc kubenswrapper[4771]: I0227 02:15:03.914481 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc140a39-01f4-4730-a525-4a1f8e4477fb-secret-volume\") pod \"bc140a39-01f4-4730-a525-4a1f8e4477fb\" (UID: \"bc140a39-01f4-4730-a525-4a1f8e4477fb\") " Feb 27 02:15:03 crc kubenswrapper[4771]: I0227 02:15:03.915180 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc140a39-01f4-4730-a525-4a1f8e4477fb-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc140a39-01f4-4730-a525-4a1f8e4477fb" (UID: "bc140a39-01f4-4730-a525-4a1f8e4477fb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 02:15:03 crc kubenswrapper[4771]: I0227 02:15:03.920626 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc140a39-01f4-4730-a525-4a1f8e4477fb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bc140a39-01f4-4730-a525-4a1f8e4477fb" (UID: "bc140a39-01f4-4730-a525-4a1f8e4477fb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 02:15:03 crc kubenswrapper[4771]: I0227 02:15:03.921676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc140a39-01f4-4730-a525-4a1f8e4477fb-kube-api-access-ddtmr" (OuterVolumeSpecName: "kube-api-access-ddtmr") pod "bc140a39-01f4-4730-a525-4a1f8e4477fb" (UID: "bc140a39-01f4-4730-a525-4a1f8e4477fb"). InnerVolumeSpecName "kube-api-access-ddtmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:15:04 crc kubenswrapper[4771]: I0227 02:15:04.016798 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc140a39-01f4-4730-a525-4a1f8e4477fb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 02:15:04 crc kubenswrapper[4771]: I0227 02:15:04.016831 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bc140a39-01f4-4730-a525-4a1f8e4477fb-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 02:15:04 crc kubenswrapper[4771]: I0227 02:15:04.016844 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddtmr\" (UniqueName: \"kubernetes.io/projected/bc140a39-01f4-4730-a525-4a1f8e4477fb-kube-api-access-ddtmr\") on node \"crc\" DevicePath \"\"" Feb 27 02:15:04 crc kubenswrapper[4771]: I0227 02:15:04.340124 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" event={"ID":"bc140a39-01f4-4730-a525-4a1f8e4477fb","Type":"ContainerDied","Data":"ec3b1f163afce6c1311c56dde2258cc3e6eab96339a48cf3dbc59e10f428d80e"} Feb 27 02:15:04 crc kubenswrapper[4771]: I0227 02:15:04.340173 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec3b1f163afce6c1311c56dde2258cc3e6eab96339a48cf3dbc59e10f428d80e" Feb 27 02:15:04 crc kubenswrapper[4771]: I0227 02:15:04.340175 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535975-cgjjh" Feb 27 02:15:04 crc kubenswrapper[4771]: I0227 02:15:04.410279 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns"] Feb 27 02:15:04 crc kubenswrapper[4771]: I0227 02:15:04.417411 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535930-wkmns"] Feb 27 02:15:05 crc kubenswrapper[4771]: I0227 02:15:05.782136 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f3600b-05b8-494c-b37b-85be607f8186" path="/var/lib/kubelet/pods/47f3600b-05b8-494c-b37b-85be607f8186/volumes" Feb 27 02:15:16 crc kubenswrapper[4771]: I0227 02:15:16.776210 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:15:16 crc kubenswrapper[4771]: E0227 02:15:16.781039 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hw7dn_openshift-machine-config-operator(ca81e505-d53f-496e-bd26-7cec669591e4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.460195 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k8ck8"] Feb 27 02:15:29 crc kubenswrapper[4771]: E0227 02:15:29.461112 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc140a39-01f4-4730-a525-4a1f8e4477fb" containerName="collect-profiles" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.461124 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc140a39-01f4-4730-a525-4a1f8e4477fb" containerName="collect-profiles" Feb 27 02:15:29 crc 
kubenswrapper[4771]: I0227 02:15:29.461321 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc140a39-01f4-4730-a525-4a1f8e4477fb" containerName="collect-profiles" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.462628 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.480163 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8ck8"] Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.546450 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-utilities\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.546616 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-catalog-content\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.546680 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6mcx\" (UniqueName: \"kubernetes.io/projected/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-kube-api-access-l6mcx\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.648122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6mcx\" (UniqueName: \"kubernetes.io/projected/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-kube-api-access-l6mcx\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.648637 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-utilities\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.648719 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-catalog-content\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.649094 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-utilities\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.649294 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-catalog-content\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.667901 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6mcx\" (UniqueName: \"kubernetes.io/projected/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-kube-api-access-l6mcx\") pod \"redhat-marketplace-k8ck8\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:29 crc kubenswrapper[4771]: I0227 02:15:29.779634 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:30 crc kubenswrapper[4771]: I0227 02:15:30.234909 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8ck8"] Feb 27 02:15:30 crc kubenswrapper[4771]: I0227 02:15:30.606627 4771 generic.go:334] "Generic (PLEG): container finished" podID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerID="38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf" exitCode=0 Feb 27 02:15:30 crc kubenswrapper[4771]: I0227 02:15:30.607075 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8ck8" event={"ID":"9dc64338-5e8e-4016-8c85-7e5ee93ac70f","Type":"ContainerDied","Data":"38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf"} Feb 27 02:15:30 crc kubenswrapper[4771]: I0227 02:15:30.607118 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8ck8" event={"ID":"9dc64338-5e8e-4016-8c85-7e5ee93ac70f","Type":"ContainerStarted","Data":"3251bdc19dfc8031396d4baf167b62ec4924ebe8ec2baf639334355c60facdd1"} Feb 27 02:15:30 crc kubenswrapper[4771]: I0227 02:15:30.609683 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 02:15:30 crc kubenswrapper[4771]: I0227 02:15:30.773713 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:15:31 crc kubenswrapper[4771]: I0227 02:15:31.630544 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"8eeb6a600e72f58064a0af0666b2e1760333ab9c0dffb4c33941bbbff787e68a"} Feb 27 02:15:31 crc kubenswrapper[4771]: I0227 02:15:31.750781 4771 scope.go:117] "RemoveContainer" containerID="bea3b2af5322da83e83d96d97985a4ab4438b6e9e603c3ad651edb94fc325886" Feb 27 02:15:32 crc kubenswrapper[4771]: I0227 02:15:32.641061 4771 generic.go:334] "Generic (PLEG): container finished" podID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerID="a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce" exitCode=0 Feb 27 02:15:32 crc kubenswrapper[4771]: I0227 02:15:32.641113 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8ck8" event={"ID":"9dc64338-5e8e-4016-8c85-7e5ee93ac70f","Type":"ContainerDied","Data":"a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce"} Feb 27 02:15:33 crc kubenswrapper[4771]: I0227 02:15:33.656712 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8ck8" 
event={"ID":"9dc64338-5e8e-4016-8c85-7e5ee93ac70f","Type":"ContainerStarted","Data":"e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038"} Feb 27 02:15:33 crc kubenswrapper[4771]: I0227 02:15:33.678731 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k8ck8" podStartSLOduration=2.180699968 podStartE2EDuration="4.678703043s" podCreationTimestamp="2026-02-27 02:15:29 +0000 UTC" firstStartedPulling="2026-02-27 02:15:30.609244032 +0000 UTC m=+4243.546805360" lastFinishedPulling="2026-02-27 02:15:33.107247097 +0000 UTC m=+4246.044808435" observedRunningTime="2026-02-27 02:15:33.67673351 +0000 UTC m=+4246.614294838" watchObservedRunningTime="2026-02-27 02:15:33.678703043 +0000 UTC m=+4246.616264371" Feb 27 02:15:39 crc kubenswrapper[4771]: I0227 02:15:39.792257 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:39 crc kubenswrapper[4771]: I0227 02:15:39.793264 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:39 crc kubenswrapper[4771]: I0227 02:15:39.840413 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:40 crc kubenswrapper[4771]: I0227 02:15:40.802822 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:40 crc kubenswrapper[4771]: I0227 02:15:40.860482 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8ck8"] Feb 27 02:15:42 crc kubenswrapper[4771]: I0227 02:15:42.807396 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k8ck8" podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerName="registry-server" containerID="cri-o://e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038" gracePeriod=2 Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.307057 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.432592 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6mcx\" (UniqueName: \"kubernetes.io/projected/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-kube-api-access-l6mcx\") pod \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.432879 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-utilities\") pod \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.432975 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-catalog-content\") pod \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\" (UID: \"9dc64338-5e8e-4016-8c85-7e5ee93ac70f\") " Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.434244 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-utilities" (OuterVolumeSpecName: "utilities") pod "9dc64338-5e8e-4016-8c85-7e5ee93ac70f" (UID: "9dc64338-5e8e-4016-8c85-7e5ee93ac70f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.439057 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-kube-api-access-l6mcx" (OuterVolumeSpecName: "kube-api-access-l6mcx") pod "9dc64338-5e8e-4016-8c85-7e5ee93ac70f" (UID: "9dc64338-5e8e-4016-8c85-7e5ee93ac70f"). InnerVolumeSpecName "kube-api-access-l6mcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.465642 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dc64338-5e8e-4016-8c85-7e5ee93ac70f" (UID: "9dc64338-5e8e-4016-8c85-7e5ee93ac70f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.535626 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6mcx\" (UniqueName: \"kubernetes.io/projected/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-kube-api-access-l6mcx\") on node \"crc\" DevicePath \"\"" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.535670 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.535683 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc64338-5e8e-4016-8c85-7e5ee93ac70f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.819701 4771 generic.go:334] "Generic (PLEG): container finished" podID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerID="e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038" exitCode=0 Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.819766 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8ck8" event={"ID":"9dc64338-5e8e-4016-8c85-7e5ee93ac70f","Type":"ContainerDied","Data":"e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038"} Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.819767 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k8ck8" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.819816 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k8ck8" event={"ID":"9dc64338-5e8e-4016-8c85-7e5ee93ac70f","Type":"ContainerDied","Data":"3251bdc19dfc8031396d4baf167b62ec4924ebe8ec2baf639334355c60facdd1"} Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.819837 4771 scope.go:117] "RemoveContainer" containerID="e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.860406 4771 scope.go:117] "RemoveContainer" containerID="a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.868029 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8ck8"] Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.880131 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k8ck8"] Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.898954 4771 scope.go:117] "RemoveContainer" containerID="38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.941804 4771 scope.go:117] "RemoveContainer" containerID="e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038" Feb 27 02:15:43 crc kubenswrapper[4771]: E0227 02:15:43.942290 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038\": container with ID starting with e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038 not found: ID does not exist" containerID="e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.942324 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038"} err="failed to get container status \"e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038\": rpc error: code = NotFound desc = could not find container \"e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038\": container with ID starting with e2a26c12c04bdb3f604dd570a9f53a6533b801321cad2de7379bfc18fa085038 not found: ID does not exist" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.942344 4771 scope.go:117] "RemoveContainer" containerID="a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce" Feb 27 02:15:43 crc kubenswrapper[4771]: E0227 02:15:43.942648 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce\": container with ID starting with a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce not found: ID does not exist" containerID="a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.942684 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce"} err="failed to get container status \"a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce\": rpc error: code = NotFound desc = could not find container \"a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce\": container with ID starting with a71b2ae7bf25708e2044e1f3e12034516af42fa693c41089733b9be1f0d953ce not found: ID does not exist" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.942699 4771 scope.go:117] "RemoveContainer" containerID="38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf" Feb 27 02:15:43 crc kubenswrapper[4771]: E0227 02:15:43.942935 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf\": container with ID starting with 38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf not found: ID does not exist" containerID="38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf" Feb 27 02:15:43 crc kubenswrapper[4771]: I0227 02:15:43.942962 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf"} err="failed to get container status \"38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf\": rpc error: code = NotFound desc = could not find container \"38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf\": container with ID starting with 38f00d781a869ac43f57a977452d591299ee0bfac0a1034f7692f68800cfafbf not found: ID does not exist" Feb 27 02:15:45 crc kubenswrapper[4771]: I0227 02:15:45.796064 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" path="/var/lib/kubelet/pods/9dc64338-5e8e-4016-8c85-7e5ee93ac70f/volumes" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.162250 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535976-zhxzz"] Feb 27 02:16:00 crc kubenswrapper[4771]: E0227 02:16:00.163242 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerName="extract-content" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.163259 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerName="extract-content" Feb 27 02:16:00 crc kubenswrapper[4771]: E0227 02:16:00.163279 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerName="extract-utilities" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.163287 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerName="extract-utilities" Feb 27 02:16:00 crc kubenswrapper[4771]: E0227 02:16:00.163307 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerName="registry-server" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.163317 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerName="registry-server" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.163576 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc64338-5e8e-4016-8c85-7e5ee93ac70f" containerName="registry-server" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.164344 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.168156 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.168609 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.169637 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.181541 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535976-zhxzz"] Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.301645 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9wj\" (UniqueName: \"kubernetes.io/projected/de18a2d2-e432-4fe6-abae-ed0ef8c2c993-kube-api-access-7d9wj\") pod \"auto-csr-approver-29535976-zhxzz\" (UID: \"de18a2d2-e432-4fe6-abae-ed0ef8c2c993\") " pod="openshift-infra/auto-csr-approver-29535976-zhxzz" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.402783 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9wj\" (UniqueName: \"kubernetes.io/projected/de18a2d2-e432-4fe6-abae-ed0ef8c2c993-kube-api-access-7d9wj\") pod \"auto-csr-approver-29535976-zhxzz\" (UID: \"de18a2d2-e432-4fe6-abae-ed0ef8c2c993\") " pod="openshift-infra/auto-csr-approver-29535976-zhxzz" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.437278 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9wj\" (UniqueName: \"kubernetes.io/projected/de18a2d2-e432-4fe6-abae-ed0ef8c2c993-kube-api-access-7d9wj\") pod \"auto-csr-approver-29535976-zhxzz\" (UID: \"de18a2d2-e432-4fe6-abae-ed0ef8c2c993\") " pod="openshift-infra/auto-csr-approver-29535976-zhxzz" Feb 27 02:16:00 crc kubenswrapper[4771]: I0227 02:16:00.506301 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" Feb 27 02:16:01 crc kubenswrapper[4771]: I0227 02:16:01.009003 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535976-zhxzz"] Feb 27 02:16:01 crc kubenswrapper[4771]: I0227 02:16:01.932660 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" event={"ID":"de18a2d2-e432-4fe6-abae-ed0ef8c2c993","Type":"ContainerStarted","Data":"e971317e8182b45fe29b67d621fa538e57a5fc211654dfbc49991d142688ca5d"} Feb 27 02:16:02 crc kubenswrapper[4771]: I0227 02:16:02.946573 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" event={"ID":"de18a2d2-e432-4fe6-abae-ed0ef8c2c993","Type":"ContainerStarted","Data":"ce9073019dd70ab51ca1d45c558226a6bd9b4f0572ab181fb5b0dafbbbc3ca04"} Feb 27 02:16:02 crc kubenswrapper[4771]: I0227 02:16:02.982376 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" podStartSLOduration=1.873789256 podStartE2EDuration="2.982348765s" podCreationTimestamp="2026-02-27 02:16:00 +0000 UTC" firstStartedPulling="2026-02-27 02:16:01.015887252 +0000 UTC m=+4273.953448540" lastFinishedPulling="2026-02-27 02:16:02.124446751 +0000 UTC m=+4275.062008049" observedRunningTime="2026-02-27 02:16:02.974566784 +0000 UTC m=+4275.912128072" watchObservedRunningTime="2026-02-27 02:16:02.982348765 +0000 UTC m=+4275.919910103" Feb 27 02:16:03 crc kubenswrapper[4771]: I0227 02:16:03.972465 4771 generic.go:334] "Generic (PLEG): container finished" podID="de18a2d2-e432-4fe6-abae-ed0ef8c2c993" containerID="ce9073019dd70ab51ca1d45c558226a6bd9b4f0572ab181fb5b0dafbbbc3ca04" exitCode=0 Feb 27 02:16:03 crc kubenswrapper[4771]: I0227 02:16:03.973342 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" event={"ID":"de18a2d2-e432-4fe6-abae-ed0ef8c2c993","Type":"ContainerDied","Data":"ce9073019dd70ab51ca1d45c558226a6bd9b4f0572ab181fb5b0dafbbbc3ca04"} Feb 27 02:16:05 crc kubenswrapper[4771]: I0227 02:16:05.421052 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" Feb 27 02:16:05 crc kubenswrapper[4771]: I0227 02:16:05.592023 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d9wj\" (UniqueName: \"kubernetes.io/projected/de18a2d2-e432-4fe6-abae-ed0ef8c2c993-kube-api-access-7d9wj\") pod \"de18a2d2-e432-4fe6-abae-ed0ef8c2c993\" (UID: \"de18a2d2-e432-4fe6-abae-ed0ef8c2c993\") " Feb 27 02:16:05 crc kubenswrapper[4771]: I0227 02:16:05.611779 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de18a2d2-e432-4fe6-abae-ed0ef8c2c993-kube-api-access-7d9wj" (OuterVolumeSpecName: "kube-api-access-7d9wj") pod "de18a2d2-e432-4fe6-abae-ed0ef8c2c993" (UID: "de18a2d2-e432-4fe6-abae-ed0ef8c2c993"). InnerVolumeSpecName "kube-api-access-7d9wj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:16:05 crc kubenswrapper[4771]: I0227 02:16:05.694522 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d9wj\" (UniqueName: \"kubernetes.io/projected/de18a2d2-e432-4fe6-abae-ed0ef8c2c993-kube-api-access-7d9wj\") on node \"crc\" DevicePath \"\"" Feb 27 02:16:05 crc kubenswrapper[4771]: I0227 02:16:05.999869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" event={"ID":"de18a2d2-e432-4fe6-abae-ed0ef8c2c993","Type":"ContainerDied","Data":"e971317e8182b45fe29b67d621fa538e57a5fc211654dfbc49991d142688ca5d"} Feb 27 02:16:05 crc kubenswrapper[4771]: I0227 02:16:05.999909 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e971317e8182b45fe29b67d621fa538e57a5fc211654dfbc49991d142688ca5d" Feb 27 02:16:06 crc kubenswrapper[4771]: I0227 02:16:05.999966 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535976-zhxzz" Feb 27 02:16:06 crc kubenswrapper[4771]: I0227 02:16:06.051235 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535970-cq27p"] Feb 27 02:16:06 crc kubenswrapper[4771]: I0227 02:16:06.059316 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535970-cq27p"] Feb 27 02:16:07 crc kubenswrapper[4771]: I0227 02:16:07.792637 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b2f301-2490-4c19-b932-77fface25a45" path="/var/lib/kubelet/pods/c0b2f301-2490-4c19-b932-77fface25a45/volumes" Feb 27 02:16:27 crc kubenswrapper[4771]: I0227 02:16:27.219923 4771 generic.go:334] "Generic (PLEG): container finished" podID="375bb02e-1244-4971-8c93-07ee9b85b707" containerID="303446f56183c30347d19b02835526b8788297756fdb321a9df3f3f88c5ee6be" exitCode=0 Feb 27 02:16:27 crc kubenswrapper[4771]: I0227 02:16:27.220034 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n5q7k/must-gather-54md8" event={"ID":"375bb02e-1244-4971-8c93-07ee9b85b707","Type":"ContainerDied","Data":"303446f56183c30347d19b02835526b8788297756fdb321a9df3f3f88c5ee6be"} Feb 27 02:16:27 crc kubenswrapper[4771]: I0227 02:16:27.221220 4771 scope.go:117] "RemoveContainer" containerID="303446f56183c30347d19b02835526b8788297756fdb321a9df3f3f88c5ee6be" Feb 27 02:16:27 crc kubenswrapper[4771]: I0227 02:16:27.794511 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5q7k_must-gather-54md8_375bb02e-1244-4971-8c93-07ee9b85b707/gather/0.log" Feb 27 02:16:31 crc kubenswrapper[4771]: I0227 02:16:31.838532 4771 scope.go:117] "RemoveContainer" containerID="bf1719b71f52a0faa6d35e36cd9db3739459261dfc7ab5492a9c2f1163665cf8" Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.068708 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n5q7k/must-gather-54md8"] Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.069795 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n5q7k/must-gather-54md8" podUID="375bb02e-1244-4971-8c93-07ee9b85b707" containerName="copy" containerID="cri-o://8811ee2926d09f09109a1ccdd110cac73fda5825aca403d8fb6f38d32be101d5" gracePeriod=2 Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.082512 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n5q7k/must-gather-54md8"] Feb 27 02:16:39 crc 
kubenswrapper[4771]: I0227 02:16:39.371631 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5q7k_must-gather-54md8_375bb02e-1244-4971-8c93-07ee9b85b707/copy/0.log" Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.373119 4771 generic.go:334] "Generic (PLEG): container finished" podID="375bb02e-1244-4971-8c93-07ee9b85b707" containerID="8811ee2926d09f09109a1ccdd110cac73fda5825aca403d8fb6f38d32be101d5" exitCode=143 Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.566992 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5q7k_must-gather-54md8_375bb02e-1244-4971-8c93-07ee9b85b707/copy/0.log" Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.567761 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/must-gather-54md8" Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.655072 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c82vr\" (UniqueName: \"kubernetes.io/projected/375bb02e-1244-4971-8c93-07ee9b85b707-kube-api-access-c82vr\") pod \"375bb02e-1244-4971-8c93-07ee9b85b707\" (UID: \"375bb02e-1244-4971-8c93-07ee9b85b707\") " Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.655337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/375bb02e-1244-4971-8c93-07ee9b85b707-must-gather-output\") pod \"375bb02e-1244-4971-8c93-07ee9b85b707\" (UID: \"375bb02e-1244-4971-8c93-07ee9b85b707\") " Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.661571 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375bb02e-1244-4971-8c93-07ee9b85b707-kube-api-access-c82vr" (OuterVolumeSpecName: "kube-api-access-c82vr") pod "375bb02e-1244-4971-8c93-07ee9b85b707" (UID: "375bb02e-1244-4971-8c93-07ee9b85b707"). InnerVolumeSpecName "kube-api-access-c82vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.757272 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c82vr\" (UniqueName: \"kubernetes.io/projected/375bb02e-1244-4971-8c93-07ee9b85b707-kube-api-access-c82vr\") on node \"crc\" DevicePath \"\"" Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.810827 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/375bb02e-1244-4971-8c93-07ee9b85b707-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "375bb02e-1244-4971-8c93-07ee9b85b707" (UID: "375bb02e-1244-4971-8c93-07ee9b85b707"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 02:16:39 crc kubenswrapper[4771]: I0227 02:16:39.859648 4771 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/375bb02e-1244-4971-8c93-07ee9b85b707-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 27 02:16:40 crc kubenswrapper[4771]: I0227 02:16:40.385803 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n5q7k_must-gather-54md8_375bb02e-1244-4971-8c93-07ee9b85b707/copy/0.log" Feb 27 02:16:40 crc kubenswrapper[4771]: I0227 02:16:40.386336 4771 scope.go:117] "RemoveContainer" containerID="8811ee2926d09f09109a1ccdd110cac73fda5825aca403d8fb6f38d32be101d5" Feb 27 02:16:40 crc kubenswrapper[4771]: I0227 02:16:40.386599 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n5q7k/must-gather-54md8" Feb 27 02:16:40 crc kubenswrapper[4771]: I0227 02:16:40.408404 4771 scope.go:117] "RemoveContainer" containerID="303446f56183c30347d19b02835526b8788297756fdb321a9df3f3f88c5ee6be" Feb 27 02:16:41 crc kubenswrapper[4771]: I0227 02:16:41.784589 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375bb02e-1244-4971-8c93-07ee9b85b707" path="/var/lib/kubelet/pods/375bb02e-1244-4971-8c93-07ee9b85b707/volumes" Feb 27 02:17:58 crc kubenswrapper[4771]: I0227 02:17:58.952823 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 02:17:58 crc kubenswrapper[4771]: I0227 02:17:58.953506 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.150540 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535978-g6ks6"] Feb 27 02:18:00 crc kubenswrapper[4771]: E0227 02:18:00.151089 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375bb02e-1244-4971-8c93-07ee9b85b707" containerName="copy" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.151105 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="375bb02e-1244-4971-8c93-07ee9b85b707" containerName="copy" Feb 27 02:18:00 crc kubenswrapper[4771]: E0227 02:18:00.151116 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375bb02e-1244-4971-8c93-07ee9b85b707" containerName="gather" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.151124 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="375bb02e-1244-4971-8c93-07ee9b85b707" containerName="gather" Feb 27 02:18:00 crc kubenswrapper[4771]: E0227 02:18:00.151141 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de18a2d2-e432-4fe6-abae-ed0ef8c2c993" containerName="oc" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.151149 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="de18a2d2-e432-4fe6-abae-ed0ef8c2c993" containerName="oc" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.151386 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="de18a2d2-e432-4fe6-abae-ed0ef8c2c993" containerName="oc" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.151412 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="375bb02e-1244-4971-8c93-07ee9b85b707" containerName="copy" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.151442 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="375bb02e-1244-4971-8c93-07ee9b85b707" containerName="gather" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.152218 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535978-g6ks6" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.154440 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.154972 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.160375 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535978-g6ks6"] Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.162725 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.299123 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm97p\" (UniqueName: \"kubernetes.io/projected/225fd216-8646-4048-b4c0-0d4ecffe1350-kube-api-access-lm97p\") pod \"auto-csr-approver-29535978-g6ks6\" (UID: \"225fd216-8646-4048-b4c0-0d4ecffe1350\") " pod="openshift-infra/auto-csr-approver-29535978-g6ks6" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.401334 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm97p\" (UniqueName: \"kubernetes.io/projected/225fd216-8646-4048-b4c0-0d4ecffe1350-kube-api-access-lm97p\") pod \"auto-csr-approver-29535978-g6ks6\" (UID: \"225fd216-8646-4048-b4c0-0d4ecffe1350\") " pod="openshift-infra/auto-csr-approver-29535978-g6ks6" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.421313 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm97p\" (UniqueName: \"kubernetes.io/projected/225fd216-8646-4048-b4c0-0d4ecffe1350-kube-api-access-lm97p\") pod \"auto-csr-approver-29535978-g6ks6\" (UID: \"225fd216-8646-4048-b4c0-0d4ecffe1350\") " pod="openshift-infra/auto-csr-approver-29535978-g6ks6" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.490294 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535978-g6ks6" Feb 27 02:18:00 crc kubenswrapper[4771]: I0227 02:18:00.956250 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535978-g6ks6"] Feb 27 02:18:01 crc kubenswrapper[4771]: I0227 02:18:01.251050 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535978-g6ks6" event={"ID":"225fd216-8646-4048-b4c0-0d4ecffe1350","Type":"ContainerStarted","Data":"baeb5468b481680d51eb1c915e182877c52082da9a37a68db2047e9f24d5977e"} Feb 27 02:18:03 crc kubenswrapper[4771]: I0227 02:18:03.272058 4771 generic.go:334] "Generic (PLEG): container finished" podID="225fd216-8646-4048-b4c0-0d4ecffe1350" containerID="53c29fa147e2b9263d1ff870cd22f8f03f25c09e1102f76cacccd9e472f6deca" exitCode=0 Feb 27 02:18:03 crc kubenswrapper[4771]: I0227 02:18:03.272147 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535978-g6ks6" event={"ID":"225fd216-8646-4048-b4c0-0d4ecffe1350","Type":"ContainerDied","Data":"53c29fa147e2b9263d1ff870cd22f8f03f25c09e1102f76cacccd9e472f6deca"} Feb 27 02:18:04 crc kubenswrapper[4771]: I0227 02:18:04.703086 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535978-g6ks6" Feb 27 02:18:04 crc kubenswrapper[4771]: I0227 02:18:04.900753 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm97p\" (UniqueName: \"kubernetes.io/projected/225fd216-8646-4048-b4c0-0d4ecffe1350-kube-api-access-lm97p\") pod \"225fd216-8646-4048-b4c0-0d4ecffe1350\" (UID: \"225fd216-8646-4048-b4c0-0d4ecffe1350\") " Feb 27 02:18:04 crc kubenswrapper[4771]: I0227 02:18:04.908294 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225fd216-8646-4048-b4c0-0d4ecffe1350-kube-api-access-lm97p" (OuterVolumeSpecName: "kube-api-access-lm97p") pod "225fd216-8646-4048-b4c0-0d4ecffe1350" (UID: "225fd216-8646-4048-b4c0-0d4ecffe1350"). InnerVolumeSpecName "kube-api-access-lm97p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:18:05 crc kubenswrapper[4771]: I0227 02:18:05.002921 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm97p\" (UniqueName: \"kubernetes.io/projected/225fd216-8646-4048-b4c0-0d4ecffe1350-kube-api-access-lm97p\") on node \"crc\" DevicePath \"\"" Feb 27 02:18:05 crc kubenswrapper[4771]: I0227 02:18:05.297440 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535978-g6ks6" event={"ID":"225fd216-8646-4048-b4c0-0d4ecffe1350","Type":"ContainerDied","Data":"baeb5468b481680d51eb1c915e182877c52082da9a37a68db2047e9f24d5977e"} Feb 27 02:18:05 crc kubenswrapper[4771]: I0227 02:18:05.297761 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baeb5468b481680d51eb1c915e182877c52082da9a37a68db2047e9f24d5977e" Feb 27 02:18:05 crc kubenswrapper[4771]: I0227 02:18:05.297491 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535978-g6ks6" Feb 27 02:18:05 crc kubenswrapper[4771]: I0227 02:18:05.770253 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535972-5rfsz"] Feb 27 02:18:05 crc kubenswrapper[4771]: I0227 02:18:05.783022 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535972-5rfsz"] Feb 27 02:18:07 crc kubenswrapper[4771]: I0227 02:18:07.792466 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2947f34-0e2b-4968-9c29-ef67acacebb0" path="/var/lib/kubelet/pods/d2947f34-0e2b-4968-9c29-ef67acacebb0/volumes" Feb 27 02:18:28 crc kubenswrapper[4771]: I0227 02:18:28.953472 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 02:18:28 crc kubenswrapper[4771]: I0227 02:18:28.954092 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 02:18:32 crc kubenswrapper[4771]: I0227 02:18:32.021094 4771 scope.go:117] "RemoveContainer" containerID="17eed80d8b151144d9ba77d19620947864ff00c767218f3ffaefa7ec5b42e1d1" Feb 27 02:18:58 crc kubenswrapper[4771]: I0227 02:18:58.952742 4771 patch_prober.go:28] interesting pod/machine-config-daemon-hw7dn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 02:18:58 crc kubenswrapper[4771]: I0227 02:18:58.953332 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 02:18:58 crc kubenswrapper[4771]: I0227 02:18:58.953382 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" Feb 27 02:18:58 crc kubenswrapper[4771]: I0227 02:18:58.954173 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8eeb6a600e72f58064a0af0666b2e1760333ab9c0dffb4c33941bbbff787e68a"} pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 02:18:58 crc kubenswrapper[4771]: I0227 02:18:58.954255 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" podUID="ca81e505-d53f-496e-bd26-7cec669591e4" containerName="machine-config-daemon" containerID="cri-o://8eeb6a600e72f58064a0af0666b2e1760333ab9c0dffb4c33941bbbff787e68a" gracePeriod=600 Feb 27 02:18:59 crc kubenswrapper[4771]: I0227 02:18:59.956730 4771 generic.go:334] "Generic (PLEG): container finished" podID="ca81e505-d53f-496e-bd26-7cec669591e4" 
containerID="8eeb6a600e72f58064a0af0666b2e1760333ab9c0dffb4c33941bbbff787e68a" exitCode=0 Feb 27 02:18:59 crc kubenswrapper[4771]: I0227 02:18:59.956753 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerDied","Data":"8eeb6a600e72f58064a0af0666b2e1760333ab9c0dffb4c33941bbbff787e68a"} Feb 27 02:18:59 crc kubenswrapper[4771]: I0227 02:18:59.957160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hw7dn" event={"ID":"ca81e505-d53f-496e-bd26-7cec669591e4","Type":"ContainerStarted","Data":"791eecd382e0b72017d8e1a0c4f2906311be900f0a3495c78dafda986328422e"} Feb 27 02:18:59 crc kubenswrapper[4771]: I0227 02:18:59.957185 4771 scope.go:117] "RemoveContainer" containerID="b0c27932ed00fff85ebf2f13a04a13a6f0f41d0a900e5847cbefa05fe75f99eb" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.154827 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535980-dvgvm"] Feb 27 02:20:00 crc kubenswrapper[4771]: E0227 02:20:00.156030 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225fd216-8646-4048-b4c0-0d4ecffe1350" containerName="oc" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.156049 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="225fd216-8646-4048-b4c0-0d4ecffe1350" containerName="oc" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.156418 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="225fd216-8646-4048-b4c0-0d4ecffe1350" containerName="oc" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.157336 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535980-dvgvm" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.160105 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.160401 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gg4db" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.165240 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535980-dvgvm"] Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.166139 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.267305 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2s42\" (UniqueName: \"kubernetes.io/projected/97193ec2-8523-4aad-9083-a5be7b136f91-kube-api-access-j2s42\") pod \"auto-csr-approver-29535980-dvgvm\" (UID: \"97193ec2-8523-4aad-9083-a5be7b136f91\") " pod="openshift-infra/auto-csr-approver-29535980-dvgvm" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.368652 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2s42\" (UniqueName: \"kubernetes.io/projected/97193ec2-8523-4aad-9083-a5be7b136f91-kube-api-access-j2s42\") pod \"auto-csr-approver-29535980-dvgvm\" (UID: \"97193ec2-8523-4aad-9083-a5be7b136f91\") " pod="openshift-infra/auto-csr-approver-29535980-dvgvm" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.390891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2s42\" (UniqueName: \"kubernetes.io/projected/97193ec2-8523-4aad-9083-a5be7b136f91-kube-api-access-j2s42\") pod \"auto-csr-approver-29535980-dvgvm\" (UID: \"97193ec2-8523-4aad-9083-a5be7b136f91\") " pod="openshift-infra/auto-csr-approver-29535980-dvgvm" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.476334 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535980-dvgvm" Feb 27 02:20:00 crc kubenswrapper[4771]: I0227 02:20:00.968721 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535980-dvgvm"] Feb 27 02:20:01 crc kubenswrapper[4771]: I0227 02:20:01.691795 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535980-dvgvm" event={"ID":"97193ec2-8523-4aad-9083-a5be7b136f91","Type":"ContainerStarted","Data":"cfd76ba19473839c24f1e650bf2a5e91eac1c8d936df9a59e38c164f555aaf66"} Feb 27 02:20:02 crc kubenswrapper[4771]: I0227 02:20:02.704979 4771 generic.go:334] "Generic (PLEG): container finished" podID="97193ec2-8523-4aad-9083-a5be7b136f91" containerID="4bcaaa10729d76d17627a7d3f129d3ece799c1092c35433050645d9ea13b101d" exitCode=0 Feb 27 02:20:02 crc kubenswrapper[4771]: I0227 02:20:02.705191 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535980-dvgvm" event={"ID":"97193ec2-8523-4aad-9083-a5be7b136f91","Type":"ContainerDied","Data":"4bcaaa10729d76d17627a7d3f129d3ece799c1092c35433050645d9ea13b101d"} Feb 27 02:20:04 crc kubenswrapper[4771]: I0227 02:20:04.081528 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535980-dvgvm" Feb 27 02:20:04 crc kubenswrapper[4771]: I0227 02:20:04.249358 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2s42\" (UniqueName: \"kubernetes.io/projected/97193ec2-8523-4aad-9083-a5be7b136f91-kube-api-access-j2s42\") pod \"97193ec2-8523-4aad-9083-a5be7b136f91\" (UID: \"97193ec2-8523-4aad-9083-a5be7b136f91\") " Feb 27 02:20:04 crc kubenswrapper[4771]: I0227 02:20:04.255927 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97193ec2-8523-4aad-9083-a5be7b136f91-kube-api-access-j2s42" (OuterVolumeSpecName: "kube-api-access-j2s42") pod "97193ec2-8523-4aad-9083-a5be7b136f91" (UID: "97193ec2-8523-4aad-9083-a5be7b136f91"). InnerVolumeSpecName "kube-api-access-j2s42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 02:20:04 crc kubenswrapper[4771]: I0227 02:20:04.352034 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2s42\" (UniqueName: \"kubernetes.io/projected/97193ec2-8523-4aad-9083-a5be7b136f91-kube-api-access-j2s42\") on node \"crc\" DevicePath \"\"" Feb 27 02:20:04 crc kubenswrapper[4771]: I0227 02:20:04.728012 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535980-dvgvm" event={"ID":"97193ec2-8523-4aad-9083-a5be7b136f91","Type":"ContainerDied","Data":"cfd76ba19473839c24f1e650bf2a5e91eac1c8d936df9a59e38c164f555aaf66"} Feb 27 02:20:04 crc kubenswrapper[4771]: I0227 02:20:04.728052 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfd76ba19473839c24f1e650bf2a5e91eac1c8d936df9a59e38c164f555aaf66" Feb 27 02:20:04 crc kubenswrapper[4771]: I0227 02:20:04.728117 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535980-dvgvm" Feb 27 02:20:05 crc kubenswrapper[4771]: I0227 02:20:05.158305 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535974-sfhz8"] Feb 27 02:20:05 crc kubenswrapper[4771]: I0227 02:20:05.166708 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535974-sfhz8"] Feb 27 02:20:05 crc kubenswrapper[4771]: I0227 02:20:05.792107 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4008d1c8-f1f6-4209-8037-2a9c68e3823a" path="/var/lib/kubelet/pods/4008d1c8-f1f6-4209-8037-2a9c68e3823a/volumes" Feb 27 02:20:32 crc kubenswrapper[4771]: I0227 02:20:32.153174 4771 scope.go:117] "RemoveContainer" containerID="d62f8ae0d5dd14962f5fdb3119317ffbc9250ddc6c7cc1c721ed88e79aeefd95"